
Willard Uncapher
paradox@actlab.rtf.utexas.edu
Dept. of RTF, College of Communications, 
University of Texas at Austin

Copyright (c) 1994 by Willard Uncapher.  This article may be freely
distributed throughout the net, but may not be reproduced in hardcopy
without permission.  Further, this copy is provided for personal use only.

[note also that the e-version suffers from the considerable loss of
formatting enhancements, particularly italics. This is a draft, so expect
a few grammatical faux pas, and a few things that need changing.]

                     Community Networks and the Internet:
                Rethinking Policy in an Era of Geodesic Access

               Introduction: The Information of Organization

	The continuing fusion of information, telecommunications, and
entertainment industries is raising many issues of how to understand or
even describe the changing distribution of material and cultural
identities, opportunities and products worldwide.  How is this distribution
organized and how might it be best 'controlled' or organized in the public
interest?  Where is active intervention advisable or even possible?
Clearly, the disjunctive global cultural flows of money, media, technology,
symbols, even the physical movement of people in search of employment,
leisure, escape, or transformation are increasingly eluding traditional
organized policy and authority.  Nation states, with their fearsome
apparatus of geophysical control and their symbolic potencies, seem ill at
ease, and they are becoming sandwiched between two regions of increasing
cultural and economic complexity and resistance.  On the one hand, these
transnational cultures and wayward information flows appear to be
engendering ever more powerful and elusive transnational corporations intent
on moving capital in accordance with abstract calculations of profit and
loss, moving companies to where they can find the right kind of work
force and the right kind of resources, and in accordance with calculations
of presumed economies of scale and intra-organizational synergies.  Cultural
products sweep from country to country as part of a transborder data flow
where data or disassembled videos and multi-media productions, coursing
along a myriad of pathways, are made whole again in some distant location.
How is a country to even set an agenda without the sovereignty implicit in
borders?

	On the other hand, the State seems even more ill at ease with the
increasing complexity, diversity, and resistance of the sub/populations
they propose to serve.  Moral crusades armed with implicit and explicit
force confront a host of ill-defined and unthought-through challenges
involving 'crime' and culture.  When the population is involved with the
issues of war, a somewhat unified public face seems present, but when
citizens return to their diverse, local interests, they disappear all too
rapidly off the cultural map.  Isn't the State supposed to bring these
groups together through symbol and leadership, through cross-subsidy and
just restraint?  So much of the world is shattering into cultural war in
the name of geophysical unity, for cultures that seem ever harder to define
or police.  At the same time new kinds of global community are often being
forged, as part of the international youth cultures, as part of communities
of interest that transcend geophysical space.  British underground dance
DJs, fleeing Thatcherite restrictions against underground parties,
physically come to the US, then take their fused sense of underground dance
music back to the UK to create yet other unexpected fusions.  Indeed, even as
the State tries to forge its symbolic unities, perhaps restricting the
import or production of some videotape, barely visible, multiply mediated
'hand to hand' networks, some virtual, some not, play to a different kind
of cultural supply and demand, part of what I have elsewhere explored and
named the emerging 'global grassroots infrastructure.' Clandestine VCR
tapes, electronic networks, private interest groups seem to be redefining
'community' and the kinds of identities and differences that communities
forge.  What does it mean that communities of interest and exchange can
form so independently of the shared geophysical world of the state, based
so often on machinery and tools they cannot themselves make, in
opportunities hemmed in by transnational calculations?  What is to become
of traditional communities?  What role might those of us who want to foster
community empowerment play, and how can we even situate ourselves or
describe this rapidly changing mediascape?

	If we are to begin to answer these questions, I would submit that
we need to at least invoke the changing dynamics of higher levels of
organization. Less often thought about, perhaps, but equally profound is
the notion that institutions that serve to organize and redistribute all
this manifold cultural content are likewise products of communication
technology, even as they seek strategic ways to employ it.  From the level
of the individual relationships (e.g. Meyrowitz 1986), to those of the
nation-state (e.g. Appadurai 1990, Hannerz 1992) and the transnational
corporations, we see boundaries and hierarchies established to organize
access and resources, the foundations of power.  Power might be thought of
in terms of the access and distribution of resources, in terms of who gets
access to film equipment, to the Hollywood distribution systems, to patent
rights, to university degrees, to capital markets, and to the way we set
about creating and organizing access and distribution.  As will be explored
presently, some suggest that the 'nation state,' with its presumed
homogenous cultural identity, its coherent, bounded borders, its collective
debate and collective action is a product of a media revolution begun with
the mass productions of the perfected printing press, and with the new
kinds of institutions and skills it helped facilitate (Anderson 1991;
Hobsbawm 1990).  Traditionally, key locations, certain cities in the global
ecumene, or certain strategic places in an organization, such as that of the
'President' or 'Chief,' have had sufficient knowledge about the organization
and its environment so as to direct many of the organization's functions
(they will be part of larger systems), reaping much of the wealth generated
by the system which they manage.  Systems within systems created layers of
informational and material wealth and poverty.

	Much of the power of the 'information age' lies with the computer,
a device which, like some kind of automatic printing press and staff, works
to collect, evaluate, and manipulate data and connections with ever
increasing speed, flexibility, and accuracy.  Computers continue to
reorganize the traffic that courses along the old communication and
transportation networks.  But how is this changing the contexts of
surveillance and control, of collective action and knowledge, of property
and virtual being?  Whereas I have written elsewhere of the impact of
computer based technologies on the politics of spatiality and community in
the global mediascape (Uncapher 1994; cf. Soja 1989, Lefebvre 1991), the
current essay seeks to explore changes in the geometry of the emerging
organization. It is the contention of this paper that we need to press
forward in assessing the implications of computers as organizational
tools, both in how they organize information (such as in assembling music
samples), and in how they fit into complex social organizations.  Indeed I
will argue that these organizational aspects are so key to the ongoing
transformation of the mediascape, and to the associated transformation in
structures of power and wealth, that they overwhelm the importance of the
capacity of the links shuffling things around, whether twisted pair,
coaxial cable, fiber optic strand, radio or satellite transponder, or even
in material carriers such as truck, train, boat, or plane.

	I will be arguing later in this essay that the structure of the
mediascape is becoming increasingly geodesic, a term I will explain at
greater lengths in a moment.  For the nonce, let me note that the term,
coined by Buckminster Fuller, famed as a designer and mathematician, and less
well known as a social critic, speaks to the stability of a system in
terms of its multiple, overlapping redundancies.  He argued that it was the
most efficient organizational strategy in the known universe, found at the
level of the virus, and capable of being seen in immensely large, stable
structures as well.  A dome built on geodesic principles is capable of
spanning an entire sports stadium, or even a city.  Some of my readers,
particularly those familiar with telecommunications policy, will anticipate
my use of Peter Huber's famous and influential report to the Antitrust
division of the U.S. Justice Department (1987; 1992), entitled The Geodesic
Network.  I
will not disappoint them.  Huber outlined how innovations and decreases in
the costs of switching were creating a new kind of geodesic competitive
market, undermining traditional switching hierarchies, and the kinds of
companies and regulations established and negotiated in terms of these
hierarchies. Indeed, I will present a brief summary of his
arguments later in this paper.  However, I wish to provide a much overdue
extension of these ideas to examine the ongoing restructuring of the
information networks in the United States, primarily in the form of the
Internet. That I would broaden the application of geodesics to include the
global communications infrastructure is in fact very much in keeping with
Fuller's own analysis of the relationship of geodesics and stability, of
communication/transportation hierarchies and the nature of power and
economic wealth.  Probably unknown to most telecommunications policy
analysts, Fuller spent time as an analyst for Fortune magazine, and wrote
a great deal during his life about the relation between communication
structures and the accumulation and manipulation of wealth.

	The body of this paper will explore the consequences of geodesic
restructuring on the organization of the Internet. The Internet is more
than a 'highway of information' linking large and small.  It had
traditionally been a somewhat hierarchically organized switching structure
with profound resemblances to the old switching hierarchies of AT&T prior
to divestiture.  What is happening to this emerging multi-media and
information infrastructure likewise bears profound similarities to the
divestiture environment of AT&T, and to the kind of geodesic structure that
helped to bring this change about.  Surprisingly little has been written about
this transformation, and the current paper hopes to jump into the gap and
to provide some much needed insight.

	After widening the implications of Huber's analysis to this broad
institutional and historical context, I will ground and refine this notion
of geodesics by asking what these distributive changes mean for the
development of local community information exchange systems.  This
concluding section will draw on my experience helping out with policy
issues with the Austin Public Network/Internet Users Group, on interviews
and conversations I have had with other individuals who have set up or are
setting up community and business networks, and on a number of conferences,
both virtual and face to face that have been convened to examine community
networks.  By this point in the article, it should be clear why, for
example, centralized community information servers tend to be found in
communities which are only now coming online, such as in Blacksburg, Va.
whereas communities with more sophisticated online environments such as
Austin, TX have not been quick to develop a single community information
server, and most likely never will.  Given my analysis, I will conclude my
paper by suggesting several policy principles that should be considered to
promote access, community, and communication.

            Part I The Geodesic Telecommunications Infrastructure:
                         An Information Highway?

	Metaphors have great power to organize our thinking, and the
metaphor of the information highway has been no exception.  Certainly, it
has helped to focus popular debate about the ongoing fusion of the
information, telecommunications, and entertainment industries, providing
easily comprehended images with which to ask questions about access,
subsidy, development, privacy, and so on.  We can pose the question of
'access' in terms of 'how rapid will the on and off ramps to this
(super)highway be' or the question of 'subsidy' in terms of 'whether we
need a public works project, a state sponsored highway or perhaps private
toll roads.'  It is a sign of a good metaphor that it can be so fecund as
to help organize such a diversity of issues, and the importance of making
these issues as comprehensible as possible to the general public in order
to encourage debate and participation right now should not be
underestimated.

	Still, sooner or later, as the nature of what the metaphor
describes begins to escape its old confines, begins to take on a new life
(horseless carriage, wireless telegraph), so the political and historical
assumptions implicit in an old metaphor surface.  If the present essay
works to reconsider the 'information highway' it is not because of a belief
that overuse of the metaphor has dulled more precise analysis of issues of
information exchange, access, and integration, but because we need to draw
back to ask what this metaphor is telling us.  We are reaching a point
where this metaphor is hiding or obscuring more than it reveals, in fact to
a point where, enantiomorphically, the limitations and biases of the
metaphor become informative in a new and unexpected way.  The
'horseless carriage' and 'wireless telegraph' as metaphors should remind us
not simply how quaint the world might once have been, and how old metaphors
might have blinded innovators to implications of their creations, but how
important it is to recognize that the developing social infrastructure of
the 'car' or the 'radio' might be presaged by the horse's movement, or the
telegraph's re-distribution of content and power.

	Highways are about hierarchies.  Some roads are bigger and faster.
Some are narrow and can't carry heavy loads or fast traffic.  We would not
want an 18-wheeler truck heading down local streets at 65 miles per hour
(100 km/h) for fear that the truck would damage the local streets and the
local life.  Neither would we want to see local street life, from chatting
with neighbors to teaching children how to ride bikes transposed to an
interstate highway.  Looking at data networks in terms of highways gets
most of us thinking in these terms.  Nearby, there is the twisted pair,
the slenderest of telephone cables emerging from out of a house like a
small country lane, like a brook not far from its spring.  And then, far
from here, we know there must be the faster backbones, high-speed
networks transporting vast amounts of data from one region to another,
wherein roaring torrents flow.  These big highways are getting faster yet.
Some of us might know that the federally funded High Performance Computing
and Communication (HPCC) network promises to experiment and produce an even
higher speed network than is currently available to existing 'backbones.'
That world far off seems enwrapped in the mystery of complex network
protocols and hardware, probably best left in the hands of the giant firms
and governmental agencies.

	And yet this notion of hierarchy, of the gradual ascent in
capacity, is quickly and quietly being undermined.  In the topology of the
future, 'highways' can mysteriously appear in the center of the most
residential of areas, and yet be almost unknown by the neighbors.  Homes
themselves can increasingly serve to start redirecting that information
traffic: all one needs is the intelligence of a computer and a message can
be stored, or sent elsewhere.  The messages on a local bulletin board
system might have originated on many systems, with one local bulletin board
simply coordinating the flow of access and discussions.  Mysteriously, from
the perspective of the movement of media products, the solidity of the
'home' or 'office' and of the highway, of the information server and the
people using it disappears.  The material world still seems bound by its
strands and roadways, by centralized production companies, and economies of
scale, and yet the movement and manipulation of media products in the
virtual world seems increasingly volatile and ambiguous.  What is
happening?

                          Histories of Hierarchies

	One of the fundamental assumptions of this paper is that the
'geometrical' differences between the virtual and material worlds make for
policy differences between these two domains.  In the material world, great
importance is given to what I call 'center-periphery' orientation.  I would
not be disinclined to invoke the works of the global historian, Immanuel
Wallerstein who has long asserted that economic 'centers' use communication
media and political means to integrate peripheral economies into their
domination, creating a kind of global division of labor (Wallerstein 1987).
His centralization/decentralization themes extend the conceptual directions
of media historians such as Harold Innis and James Carey who have
investigated patterns of centralizing control and decentralizing expansion
associated with the historical development of communication technologies.
The telegraph and its news wire service, for example, helped to distribute
information about the world from central distribution hubs, even as they
facilitated the decentralizing movement of people and their cultures to new
frontiers.  The decentralized peripheries had to answer to the options and
the panoptic knowledge available to those who dominated the dense core
areas (Wallerstein 1990; Carey 1989).  Economic historian Fernand Braudel
speaks of a 'hierarchy of zones' within the world economy: "Every
world-economy is a sort of jigsaw puzzle, a juxtaposition of zones,
interconnected, but at different levels, with the core being advanced,
diversified, and urban" (Braudel 1984:39).

	One of the key elements of this older system of control lay in the
difficulty of assembling and interpreting information.  Where one
individual or group had to direct a complex variety of operations, a
division of labor and hierarchy was a profound necessity.  Hierarchy is a
way of handling complexity: break down the task into sub-units which can
handle the variety of choices and contingencies, and which can handle the
results generated by still smaller sub-units below them.  Strategically, this
yields a few higher level nodes from which one can survey and guide the
whole.  Since the human mind can handle only so much complexity, it was
bound to create and utilize some topology of hierarchy in an attempt to
organize the resources and economies of scale of the material world.
Following the invention of writing, we find an increasing number of social and
cultural empires founded on control of centralizing production, storage,
and analysis of information.  A class of scribes or mandarins reports to the
reigning 'mon-arch.'  The invention of printing serves both to break apart
many of the old hierarchies, creating more localized, language-oriented
divisions, and to extend and 'complexify' business strategies.
Capitalist hubs like the Netherlands and Northern Italy, the founding sites
of capitalism with its stock markets and complex accounting practices, also
soon developed information technologies such as the newspaper in the early
1600s, which served to share the information being gathered by the ships
plying their way to their ports.

	Printing and mail delivery helped distribute information in such a
way that those with economic power could understand and strategize more and
more of market volatility from a panoptic core.  The onset of the
electronic revolution developed in two distinct phases. It is the
contention of this paper that while a great deal of thought has gone into
the study of the first phase, not as much has gone into the organizational
implications of the second.  As has been much commented on, the perfection
of the telegraph in the 1830s served to sever the connection of
communication from physical transportation.  Electricity, and the messages
that it carried could travel at the speed of light, and whoever could
access these lines in a meaningful way could, to paraphrase James Madison,
master the power that information yields.  While on the one hand this new
communication technology could help to decentralize society, allowing those
who moved to the frontiers to keep in touch with the distant urban core, on
the other hand the real effect appears to have been one of increasing
centralization and control (c.f. Carey 1989).  Since only a few individuals
could afford to gain access to these new technologies, or could be located
near one of the hubs, the result was one of increasing panoptic control.
Indeed, the rational management Frederick Winslow Taylor proposed just
after the turn of the 20th century seems the apotheosis of this method of
control (cf. Beniger 1986).  Individual workers would be 'deskilled' so
that they would be interchangeable in a system of intelligent, rational
organization.  News stories spread out over the metallic wires, carrying
the telegraphic notices of distant events and markets.  The first phase of
the electronic revolution was one of enhanced but almost invisible control,
even if the popular electronic imagination saw mostly gadgets and, for the
most part, some enhanced interpersonal communication.

	While the electronic inventions of the telegraph and the telephone
severed the connection of communication from physical transportation, they
were limited from greatly decentralizing control because of the limits of
the switches.  In fact, since humans had to be sitting at the ends of these
lines, these new technologies helped to accelerate the systems of
centralized global control.  What's more, the movement of material objects
still had to course along accustomed pathways, subject to the logic of
centralization and control.  Highway systems are usually not developed by
local communities (alone) but as part of larger schemes of military and
economic control.  Where local communities did support the development of
highways, such as in the case of early toll roads, it was often to take
advantage of markets and opportunities coming from beyond the horizon.
Highways are not really part of the logic of local communities, per se.
They are built by the military and calculating agents of the larger scale
economy, and it is no surprise that so much of the interstate highway money
in the United States came from military funds based on the justification
that we needed to move troops from place to place rapidly as part and
parcel of our own defense.  Highways serve to move products rapidly from
one sector, one site of production, to another site of manufacture or
exchange, whether it is an 'electronic' highway, or one of concrete and
macadam.  What then does the image of the 'information highway' in this
context represent?  An avenue for those cyberspatial early birds who need to
move the bits and bytes around quickly and efficiently, extending the
rationale of control across virtual communities and into material
opportunities.  Some might well argue, then, that the logic of the
information highway has been one of extending the logic of centralized
control into the discordant and ever localized communities of exchange.

	With this 'center-periphery' outlook two strategies emerged for
media reformers, critics, and renegades.  On the one hand engaged social
reformers might look to changing the distribution of information at the
various centers. In the case of cable television distribution, activists
sought to make public, educational, and governmental (PEG) access channels
available.  In the case of newspaper distribution, activists sought to at
least make sure that the major newspapers were responsive to a variety of
views, or else to
set up alternative newspapers.  Where access was limited, as in the case of
cable, means to subsidize public access, the distribution of essential and
diversified information was sought.  Critical scholars, such as Adorno and
Horkheimer, seem to romanticize the power of the mass media even as they
decry the culture of its owners.  One has the sense that, according to many
critical philosophers, in a better world, enlightened philosopher kings
might be able to grab the public bull horn and speak to all those in their
vicinity, encouraging them to speak openly to one another.  If only the mass
could be unified and made aware of their own class interests, free of the
co-option of false consciousness.  So where then does this resistant
grassroots fit in a world where the mass media is disappearing, or at least
losing its universal characteristics of reaching everyone, of reaching out
at least to what Habermas might denote as the public sphere?

	On the other hand, more alienated dissidents might seek to move as
far to the 'peripheries' as possible, to what Hakim Bey has called
"Temporary Autonomous Zones" (TAZ), zones of temporary freedom and
experimentation that disappear when discovered by official powers (Bey
1991).  Even though the geophysical world can now exist under the scrutiny
of satellites, still within the fractal folds of the city, hidden in the
approximate maps of surveillance and control, there are communities of
freedom, folded in and out of site/sight.  These communities do not see
themselves as part of finding new definitions and frameworks for
transforming the control of cross-subsidies and physical power.  Their
resistance, rebellion in Camus' terms, is primarily to create sites of
heterogeneity and exploration beyond the temporal and symbolic logic of
states, a fractal anarchism.  According to Bey, such communities have
existed throughout history but because they are not the ones who write the
official histories, they rarely appear as more than a wretched footnote.
Disbanded when discovered or overwhelmed, they will perhaps regroup
somewhere else.  Currently, rediscovering these (almost) hidden histories,
such as of the Pirate utopian community experiments in places like Nassau,
of the frontier 'tri-racial isolate communities' (African, European, Native
American) created by 'cultural' refugees from Colonial America, is
taking on new life (cf. Sakolsky & Koehnline 1993).  And of pertinence to a
geodesic portion of the social structure, Bey and others who study the TAZ
have noted the new possibilities for the creation of the TAZ using
electronic networks.  'Webs' and 'Counter-Nets,' proto- and counter-cultural
outposts bound by affinity and shared differences, develop within our gaze
yet remain invisible to our sight/site.

	With the invention and ongoing development of ever more
intelligent, automatic switches (computers) to distribute and exchange
information and media streams, it is increasingly apparent that we are
seeing the beginning of a new phase of the 'electronic era' and the end of
an era dominated by the mass media (cf. Neuman 1991).  Media
and cultural theorists are scrambling to theorize the nature of this
'post-mass society' whose onset has occurred so suddenly.  Metaphors from
the old ways of thinking continue as the new is conceptualized with the
concepts of old.  Radio was first thought of as a wireless telegraph, and
its implication for broadcast to a mass audience was unforeseen.  The
telephone was simply 'Bell's plaything,' and the patent rights to the
invention were turned down by Western Union Telegraph since the notion of
wiring so many homes with the new invention was unforeseen; the car was a
horseless carriage.  What is happening to our media structures is more
profound than can be captured by thinking of the ongoing cultural change as
'narrow-casting replacing broadcasting.'  We see in the mirror of
cyberspace strange resemblances to the world we leave behind, and we take
with us our notion of Euclidean geometry when perhaps we should take out
our books on mathematical topology with its theories and lemmas of
multi-dimensional mappings.  We borrow from its sense of multiple
communities to help reform our geophysical communities, to restrategize
corporate futures, but there is a strange logic going on in this online
world, and as we use it to reconstitute ourselves, it subtly changes us.

                  A New Geometry of Telecommunications

	The notion that the computer and digital media are changing the
very pathways by which communication systems are structured, undermining
traditional hierarchies, has of course been in the eyes of telecommunication
policy makers for a number of years now.  Peter Huber, writing his now
famous summary of the evolution of the telecommunications infrastructure
for the U.S. Justice Department, spoke of the overall design of this new
topology, of this new science of mapping and connectedness, as a 'geodesic
network' (Huber 1987).  Seeking to articulate the changing contexts for
regulations and market competition in US telephony after the AT&T
divestiture, he analyzed the telecommunications infrastructure into three
essential components: lines, switches/nodes, and an overarching regulatory
structure. When the cost of lines was low and the cost of switching high,
the optimal organizational topology was to make use of a switching
hierarchy, and in the case of the U.S. telecommunications infrastructure, a
5-tier hierarchy of switches, with the smaller class 5 switch huddled down
at the local level, all the way up to the stratospheric class 1 mega-switch
capable of organizing and moving vast amounts of data between networks.
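
A small sketch may make the hierarchy concrete.  The following Python
fragment (the switch names, and the three levels standing in for the five
classes, are illustrative only, not the actual AT&T plant) routes a call by
climbing the hierarchy no higher than the lowest switch that serves both
endpoints:

    # A toy switching hierarchy: each switch points to its parent.
    parents = {
        "local-A": "regional-1", "local-B": "regional-1",
        "local-C": "regional-2",
        "regional-1": "class-1", "regional-2": "class-1",
    }

    def ancestors(switch):
        """Return the chain from a switch up to the top of the hierarchy."""
        chain = [switch]
        while switch in parents:
            switch = parents[switch]
            chain.append(switch)
        return chain

    def hierarchical_route(src, dst):
        """Climb from src only as far as the lowest switch shared with dst."""
        up, down = ancestors(src), ancestors(dst)
        meet = next(s for s in up if s in down)   # lowest common switch
        return up[:up.index(meet) + 1] + list(reversed(down[:down.index(meet)]))

    print(hierarchical_route("local-A", "local-B"))
    # ['local-A', 'regional-1', 'local-B']: the call never leaves the region
    print(hierarchical_route("local-A", "local-C"))
    # ['local-A', 'regional-1', 'class-1', 'regional-2', 'local-C']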

	In this kind of hierarchical system, data would be moved only to
the most powerful switches necessary.  When the cost of switches came down,
and the comparative cost of the lines and transmission went up, the optimal
topology was to have the information take the shortest route possible, and
to make use of the intelligence of the switches to find that route.  It was
not always obvious to outsiders that these switches were in fact nothing more than
computers, and computers nothing more than a kind of switch.  Was it any
wonder that it was AT&T that developed the transistor, and laid the
foundation for integrated circuits?  Rather than have an operator sit and
physically pull plugs on a giant board as was done up to the 1920s, the
telephone companies began to develop automatic switches (so the caller need
only dial the telephone number to tell the switch how to operate).  This
telling of remote machines how to operate and how to make connections was now
integrated into the complex hierarchical system, with ever more elaborate
'negotiation' taking place between the 'caller' and the decision making of the
switches relating to one another.  The government's need to develop its
own calculating machines, and von Neumann's 'CPU' architecture, crystallized
into a new generation of switches, a kind of universal 'computer.'

	With the development of packet switching during the 1960s,
individual switches within the network began to decide almost for
themselves how to send messages, breaking them into small 'packets' with
addresses, and then sending them across optimal, ever changing pathways to
a final destination where all the packets would be automatically
re-assembled.  If there was a mistake in transmission, if a packet was
missing or garbled, then the switches were powerful enough to call back
through the networks to ask for the information to be resent.  The result
was a vast increase in network capacity, and a new flexibility.  As the
military, which had funded much of this research, had hoped, the
telecommunications network had developed so that if some switch went out,
say by some catastrophe of war, then the information traffic could be
automatically routed around it.  Traffic need only go over the temporarily
open line, and not have to keep open a single link from source to
destination.  However, this new transmission method not only made the
telecommunications network much more efficient and more resilient to
network damage, it also began the process of 'flattening out' the network.
Rather than using a hierarchy of switches, the networking strategies
increasingly concerned themselves with ways to have the switch packetize
data and discern the optimal path in the shortest amount of time possible.
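
The mechanics can be sketched in a few lines of Python.  This is a
schematic illustration of packetizing and reassembly, not any particular
protocol; the eight-byte packet size and the tuple format are my own
simplifications:

    import math

    PACKET_SIZE = 8  # bytes per packet; real networks use far larger packets

    def packetize(message, dest):
        """Break a message into addressed, numbered packets."""
        count = math.ceil(len(message) / PACKET_SIZE)
        return [(dest, seq, count, message[seq * PACKET_SIZE:(seq + 1) * PACKET_SIZE])
                for seq in range(count)]

    def reassemble(packets):
        """Reorder whatever packets arrived; ask again for any that are missing."""
        count = packets[0][2]
        by_seq = {seq: data for (_, seq, _, data) in packets}
        missing = [seq for seq in range(count) if seq not in by_seq]
        if missing:
            raise IOError(f"request retransmission of packets {missing}")
        return b"".join(by_seq[seq] for seq in range(count))

    pkts = packetize(b"packets may arrive by different paths", "host-B")
    pkts.reverse()                      # arrival order does not matter
    print(reassemble(pkts))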

	Huber called this new topology 'geodesic,' using the term of
designer and mathematician Buckminster Fuller to describe how the strength
of the system could be achieved not through structural hierarchies, but as
the sum of the constant play of many strengths and resiliencies (or
'tensegrity') of the many interconnections forming the whole (cf. Fuller
1975:373-431; 1979:165-186).  Buckminster Fuller showed that the more
structural elements there were to the geodesic structure, the stronger it
was.  Structures based on hierarchical designs are fundamentally
unstable, since the failure of any supporting element can compromise the
integrity of the whole.  Such systems had to be built with a great number
of redundancies to compensate for their structural shortcomings, and even
then they eventually were ruined by time and entropy. It is no wonder the
foundations of traditional houses and other such buildings have to be so
strong, nor that there have to be so many crossbeams and supporting struts.
A crack in the foundations can mean the collapse of the house.  The epitome
of a geodesic structure, however, is of course the geodesic dome, a dome
which has a redundancy built into every link so that it can maintain its
overall integrity even as it loses many of its component parts.  Jay
Baldwin, who helped build Fuller's own geodesic dome on his island in Maine,
told me that the key problem he had had was how to actually secure the dome
to the ground: the dome itself was so sound that it could be lifted up
and toppled by strong New England winds, like a sail (Baldwin
1991).
geometry was the basis of natural structural integrity and stability, from
the virus to interplanetary structures (much as fractals, non-linear
iteration, and principles of chaos now appear to be at the basis of natural
growth).  The human world is littered with the ruins of hierarchical
buildings and structures, tumbled down by their own weight.
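
The difference can even be demonstrated computationally.  In the Python
sketch below (the six-node graphs are of my own invention), a hub-and-spoke
'hierarchy' and a ring-with-chords 'geodesic' mesh are each tested for
whether the surviving nodes remain connected after any single node fails:

    def connected(nodes, links, removed):
        """Breadth-first check that surviving nodes can still reach each other."""
        alive = [n for n in nodes if n != removed]
        seen, frontier = {alive[0]}, [alive[0]]
        while frontier:
            here = frontier.pop()
            for a, b in links:
                if removed in (a, b) or here not in (a, b):
                    continue
                nxt = b if a == here else a
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return len(seen) == len(alive)

    nodes = list(range(6))
    star = [(0, n) for n in range(1, 6)]           # hierarchy: all paths via hub 0
    mesh = [(n, (n + 1) % 6) for n in range(6)] + \
           [(n, (n + 2) % 6) for n in range(6)]    # ring plus redundant chords

    print(all(connected(nodes, star, r) for r in nodes))  # False: lose the hub, lose it all
    print(all(connected(nodes, mesh, r) for r in nodes))  # True: any single loss is routed around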

	Huber drew on Fuller's legacy for his telecommunications policy
study, drawing attention to the fact that each of the nodes, the end-points
of the telecommunication network, might turn out to be a link to somewhere else in
the network.  Like a magical house where entering a closet leads one into a
bedroom far away, so the terminal nodes might be connected to their own
networks.  The actual ends of the telecommunications networks were becoming
fogged in uncertainty.  End users could hook a computer up to their
telephone line, set up an electronic bulletin board, and then actively
coordinate and switch information (stored or synchronous (chat)
conversations, data, images, etc.) themselves.  End users could set up
their own local telecommunications network for their organization.  As the
intelligence of the network spread out during the late 60s and 1970s from
the center, from the massive class 1 switches out to the frontier of ever
more intelligent 'desktop' computers and other equipment, so the regulatory
structures based on a centralized, hierarchical command-control-communicate
overview grew obsolete and overly restrictive.  Bypass became almost a
certainty as pathways out from and back into the network could be established
that circumvented established regulatory conventions.

	The telephone companies of old, those corporate structures that
served to organize all the connectivity and flow were beginning to
disappear into the world of the lines and switches they had set up.
Complex systems of interconnection began to allow potential competitors to
AT&T, such as MCI and Sprint, to provide additional lines and
switches at more competitive rates to users, and often with services that
AT&T had not yet developed or brought to the market.  AT&T meanwhile chafed
under the increasingly outmoded regulatory regime, especially with its
attempt to make distinctions between basic services and 'enhanced
services,' with its many line-of-business restrictions.  The rise of
telephone companies like MCI was made viable not because they intended to
duplicate the entire AT&T network, but because with the shift to more
intelligent switches, and with the switches themselves being more
decentralized and distributed throughout the network, interconnection
became possible, even preferable.  The new telephone companies were made
possible because with the cost of switches falling and the comparative cost
of lines and transmission rising, new more specialized links within the
overall telecommunications structure could be attempted.  As the old
virtual hierarchies gave way to the increasingly geodesic flattening, so
new entrants appeared to strengthen the tensegrity of the overall stately,
informational dome of an otherworldly Xanadu.

                             Part II 
         A Geodesic Information Infrastructure: The Internet

	Surprisingly, little work has been done to extend Huber's notion of
the geodesic telecommunications structure to that of a more generalized
notion of a geodesic information structure, and it is therefore no surprise
that architects of shared virtual community spaces are likewise unclear about
the environment in which they are building.  The issue of balancing the
cost of lines and switches remains a critical issue in designing services
in data environments as complex as that of the Internet.  This is further
complicated by more complex parameters in regard to differences in line
capacity, service reliability, network openness, and so on.  Not everyone
wants the same thing with this information infrastructure, and these
differences can be capitalized on as connection companies try to package
special rates to bundle together these different needs.  The overall sense
of this rather dense assertion can be clarified by looking first at the
broadest and most talked about of information networks these days, the
Internet.

	Defining the Internet simply as 'that big network of information
services with a government history' is about as revealing as suggesting
that the telephone network is simply 'that network that interconnects our
voices.'  We need to be much more clear about what we are talking about.
Minimally, the Internet can be defined as that data system linked
together by TCP/IP, the Internet connectivity protocol which makes such
services as remote access (Telnet, finger), file transfer (FTP, email), and
the various menu access and retrieval systems (Gopher, World Wide Web,
etc.) possible.  I need not pause here to provide some sense of the history
of the Internet since there are so many documents both on-line and off that
provide different perspectives on the development of the Internet and its
protocols (Sterling 1993; LaQuey 1993; Quarterman 1992; Krol 1993, etc.).
Suffice it to say that originally it was a data network designed to link up
the Military, its Industrial Contractors, selected research universities,
as well as a number of research computational devices (such as
'super-computers'), making use of connectivity protocols that could
withstand nuclear or terrorist attack on any particular line or switch.
Suffice it also to say that many of the services on the Internet, such as
UseNet, have their own complex, independent histories which only gradually
became fused to those of the Internet.
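
To make this minimal definition concrete, consider what one of the simplest
of these services, finger, amounts to at the TCP/IP level: open a connection
to port 79 and send a name terminated by a carriage return and line feed.
The Python sketch below illustrates the protocol rather than any particular
implementation; the user and host names are hypothetical:

    import socket

    def finger(user, host):
        """Minimal finger client: one TCP connection, one CRLF-terminated query."""
        with socket.create_connection((host, 79), timeout=10) as s:
            s.sendall(user.encode("ascii") + b"\r\n")
            reply = b""
            while chunk := s.recv(4096):
                reply += chunk
        return reply.decode("ascii", "replace")

    # print(finger("someuser", "some.host.edu"))  # hypothetical user and host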

	This well known overview desperately needs to be supplemented by a
more detailed understanding of the actual organization of the Internet.  In
recent years, the Internet has resembled the pre-divestiture AT&T, with the
federally funded NSFnet providing the central Internet 'backbone' high up
at the top of the switching hierarchy, leading down through more and more
branches to distant terminals (computers), much as the old Class 1 switch
would lead down to more and more regional inter-connections, finally to the
Class 5 switch not far from the user's home or business.  This NSFnet
backbone which provided the highest capacity connectivity and which helped
to coordinate and transmit the overall flow of information and data was run
under NSF contract by ANS (Advanced Network & Services, Inc.), a consortium
made up of the Merit Corp. of Ann Arbor, Michigan, IBM, and MCI.  The
Mid-level networks in the United States, such as Colorado SuperNet, New
England's NEARnet, the Mid-Atlantic's and South's SURAnet, the Midwest's
Midnet, MOREnet, and ArkNet, Texas' Sesquinet and THEnet, to Westnet,
NorthWestNet, and so on have provided more regional interconnections.
Finally, more local providers yet such as Universities and research
institutions (.edu), military bases (.mil), commercial sites (.com),
government agencies (.gov), and specialized organizations (.org) and
networking corporations (.net) provided still more local connectivity.
This final node on this chain of command would be the local 'machine' on
the Internet, labeled by its Internet IP address.  Out beyond the
Internet proper, one could catch glimpses of other, more distant networks
and connections, some consisting of simple connections from homes via
modems and computers, some connections via local area networks (LANs),
sometimes with infrequent connections to giant private networks run by the
larger computer and telecommunication giants, often running with
incompatible sets of protocols.  Just like the plain old telephone, the
ends of the information network were being linked to something else.

                 1.  Gateways beyond the horizon: 
             A Specialized Public Wanting to Communicate

 	While this might appear to resemble the AT&T system of days past,
with its various service zones, its long distance carriers, its local
carriers, and so on, several issues must be understood.  First, the
Internet is only one data network (highway) among many.  Many other
networks exist, such as the international store-and-forward Fidonet
computer network.  In fact, FidoNet still reaches more people in more
nations all at a cheaper cost than the Internet (Dodd 1992; Bush 1993).
While popular magazines suddenly discovered the Internet sometime during
1993 and hung their expectations on the 'information infrastructure' and
the 'information super/highway' on its frame, the Internet existed as one
part of an emerging, interdependent network.  Indeed, since the costs of
joining this elite network were high, and the restrictions many, unheralded
by the magazines in their search for a new mass media, a diverse set of
amateur networkers were developing a resilient sets of networks.  Consider
Fidonet's email system.  If I wanted to send a note to Tashkent in
Uzbekistan, Central Asia, I could certainly do so to the at least six BBSs
listed there and at minimal cost via FidoNet (say to "Hacker's Trash BBS"
run by Dimon Kalintsev). At an appointed mail hour, all the Fidonet systems
in one of the three Fidonet global time zones close down to the public and
begin to communicate computer to computer, sharing mail, files, and group
conference records.  The computers collectively share mailing tasks amongst
themselves.  Fidonet is similar to the Internet in the way it allows
different nodes to share in overall connectivity, but different in that
Fidonet systems are connected to the larger network only during the
appointed mail time, not continuously as are most Internet systems.
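
A toy simulation conveys the logic of the mail hour.  The node names and
message format below are invented; real Fidonet uses zone:net/node
addresses and binary mail bundles:

    # Store-and-forward in miniature: during the appointed mail hour each
    # node uploads its queued mail to a hub, and the hub routes the bundles.
    outbound = {
        "node-1": [("node-2", "hello from node 1")],
        "node-2": [("node-1", "greetings from node 2")],
    }
    inbox = {"node-1": [], "node-2": []}

    def mail_hour():
        """One exchange cycle: collect everything, then forward it onward."""
        hub = []
        for node, mail in outbound.items():   # each node calls the hub in turn
            hub.extend(mail)
            outbound[node] = []
        for dest, text in hub:                # the hub forwards each message
            inbox[dest].append(text)

    mail_hour()
    print(inbox)   # each node finds its mail waiting after the exchange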

	Fidonet is only one of several 'store and forward' systems linking
up some 53,000 public bulletin board systems in the US, not to mention
many elsewhere in the world (Rickart 1993:13).  While the global Internet
has remained the province of the universities, large corporations, and
national governments, grassroots connectivity continues to
burgeon.  At a point when the Internet had barely penetrated Argentina, the
regional Fidonet hub, TangoNet, was already active, exchanging mail,
programs, group debates, and so forth with the rest of the world  (cf.
Quarterman 1991).  To some, grassroots systems like Fidonet can be
considered the harbingers of broader band, continuous connectivity to
follow, managing to penetrate where more capital-intensive, Internet-like
connections can reach only with expense and difficulty.

	Other store-and-forward networks exist to serve regional and/or
organizational purposes.  Depending on how open one wants one's network, a
set of bulletin boards with their own topical interests might set up a kind
of budget 'extended local area network.'  Many of the echoed discussion
groups on FidoNet circulate only within a particular region or in
association with particular events.  In Montana, the Big Sky Telegraph has
promoted local FidoNet regional networks to share regional information so
as to lower telephone costs.  The shared exchange of local electronic
dialogue and information is kept as inexpensive as possible this way, with
the Big Sky Telegraph machine itself connecting these local groups to the
larger (electronic) world via the Internet.  For those seeking broader
connectivity, Fidonet operators, following the lead of amateur radio
operators, have established satellite links to carry conference feeds.
Planet Connect of Newport, Tennessee currently provides a 19.2K baud feed
of some 15 to 20 megabytes of Fidonet conference and UseNet feeds for about
$30 a month, capable of being received by C or Ku band dishes (the initial
dish costs about $500).  Messages would then be sent upstream by more
conventional forwarding techniques.
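
The arithmetic behind such a feed is worth pausing over.  At 19.2K baud,
even allowing a generous ten transmitted bits per byte for framing (my
assumption), a continuous one-way channel carries far more than the daily
15 to 20 megabytes of conferences:

    # Rough capacity check for a 19.2K baud one-way satellite feed.
    baud = 19_200                 # bits per second
    bytes_per_sec = baud / 10     # assume ~10 bits per byte with framing overhead
    mb_per_day = bytes_per_sec * 86_400 / 1_000_000
    print(f"{mb_per_day:.0f} MB per day")   # ~166 MB/day against a 15-20 MB feed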

                  2. Gateways beyond the horizon: 
		Corporations Wanting to Communicate

	Corporations and governments with a host of special needs, such as
in terms of capacity, security, reliability, price, or connectivity, were
forging their own links.  Companies such as DEC, IBM, Texas Instruments,
and many branches of the Federal Government have long maintained their own
'internal' networks, offering a variety of services from the more limited
use of email, to the much more complicated engineering task of establishing
remote access for hundreds, even thousands, of computers.  TCP/IP is not the
only inter-networking protocol, only one of the more successful.  Local
area networks (LANs) have established a number of access topologies (ring,
star, etc.) by which computers linked together with a high capacity cable
might contact or poll each other, sharing information and resources.  The
software that supports this connectivity, such as Novell's Netware, assumes
that the cost of linking these machines together is low since the lines
tend to be relatively local, within or between rooms or local buildings.

	However, increasingly LANs are being welded together into WANs,
wide area networks that in turn link together connection-dense, inexpensive
LANs via generally expensive long-haul connections.  These WANs can be
global in their operations, linking up thousands of computers and LANs
worldwide. Texas Instruments' WAN links together over 100,000 devices
throughout the world making use of rented satellite transponders, and a
variety of cables and leased connections (Smith & Udell 1993).  Because of
the continued connectivity expense, WAN technology continues to innovate in
issues of bypass using fixed or leased lines, semi-dynamic packet switched
lines (X.25 and now 'frame relay'), and now where available, dynamic
circuit switched connections (via ISDN).  This almost external force pushes
and pulls at the edges of systems like the Internet, sometimes interfacing
with it, sometimes carrying some of its traffic, sometimes avoiding it and
its costs all together.  WANs take us back to when transport was more
expensive than switching, and their culture is one of the few remaining
that is reminiscent of the old mainframe culture.  Because the LANs are
used for sharing so much data in their connection rich environment, the
WANs that connect them have to try to find ways to pare that traffic down.
While the LANs are associated with "small groups of like minded people,
where the cardinal values are quick development, high functionality, and
access to resources," the culture of WAN management "has evolved to serve
the needs of a large and extremely diverse set of users.  Security, data
integrity, and highly scalable performance are what count here" (Tibbetts
& Bernstein 1993).  WAN culture, in contrast with LAN culture, must have its
connections planned ahead of time, must take issues of security, archiving,
and reliability very seriously, whereas LAN culture is known for simply
setting up a connection between a group of computers as a kind of
spontaneous, ad hoc exercise.

	Typical Internet connections making use of occasional file transfer
tend to make far fewer demands.  WANs tend to have a more hierarchical
ordering as they organize limited connection resources.  As LANs and WANs
gradually merge (only a dream now), and their short-haul and long-haul
methods and protocols are reorganized, and as they get to take more
advantage of the innovative topologies in switching, we should see a
flattening of connective hierarchy as well.  For now they work to extend
the Internet from the outside, playing geodesically with connections within
the Internet as well.  Speciality networks like WANs continue to expand and
strengthen the structural integrity (or tensegrity) of the whole.

		 3. Gateways beyond the horizon: 
                 Computers wanting to communicate

	As the number of data networks has continued to expand, so their links
have continued to become more seamless and automatic.  Fidonet is gated to
the Internet, and thus anyone on Fidonet can send a message to someone
on the Internet, and vice-versa.  Published books such as John Quarterman's
The Matrix have done their best to keep up with how to address mail to
different networks, trying to track the very existence of these
alternative networks (1990; also Frey 1991).  On-line documents rather than
published books are increasingly shouldering the burden of identifying and
negotiating these dynamic links since they are changing so fast (cf. Yanoff
1993).  Alternative networks and services might include not just Fidonet,
but Applelink, AT&T mail and their Easylink, DEC's Easynet, the academic
Bitnet, Alternex (Brazil), Glasnet (Russia), the Web (Canada), etc. While
some of these links depend on email like store and forward processing,
others make use of ISDN like interconnections, capable of handling more
information.  For example, Telecom Canada's Envoy-100 commercial network
uses the international X.400 address system to facilitate fast, relatively
broadband connections with neighboring networks.  As with Peter Huber's
phantom endpoints, each 'final' node might in fact be a gate to another
network, another set of connections.
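
Some of these gateways are regular enough to be computed.  The widely
published convention for gating FidoNet mail onto the Internet, for
example, folds a zone:net/node[.point] address into a host name under
fidonet.org; the Python sketch below follows that convention (the sample
name and address are invented):

    import re

    def fidonet_to_internet(user, address):
        """Map a FidoNet zone:net/node[.point] address to the fidonet.org
        gateway form, with spaces in the user name becoming dots."""
        m = re.fullmatch(r"(\d+):(\d+)/(\d+)(?:\.(\d+))?", address)
        if m is None:
            raise ValueError(f"not a FidoNet address: {address}")
        zone, net, node, point = m.groups()
        host = f"f{node}.n{net}.z{zone}.fidonet.org"
        if point:
            host = f"p{point}." + host
        return f"{user.replace(' ', '.')}@{host}"

    print(fidonet_to_internet("Jane Doe", "2:5100/1"))
    # Jane.Doe@f1.n5100.z2.fidonet.org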

		Establishing Appropriate Capacity

	In exploring the similarities between these communication networks
and the pre-divestiture AT&T, we need to realize that, unlike the earlier
telephone system, which for the most part simply exchanged voice connections
that made few demands on the carrying capacity of the old twisted pair
(the copper wire going to the home), the networks now can vary as to their
bandwidth, from the still lowly twisted pair, to T1 (1.5 Mbps), to T3 (45
Mbps), and even higher in primarily experimental or dedicated connections.
At this point the NSFnet backbone runs at T3, a rate which could move data
at a speed of 1,400 pages of text per second.  At that rate a 20-volume
encyclopedia could be sent across the net in 30 seconds (Hart 1992).  When David
Blair's movie "Wax: or the Discovery of Television by the Bees" was
transmitted recently over the Internet, only connections close to the
faster portions of the Internet could really pick it up.  While the
increasing investment in fiber optic broadband will make faster connections
increasingly available, we need to consider what is happening in this
transition period to broadband, and what this transition period indicates
about the future.  Pricing with so many factors such as reliability, capacity,
service arrangements, etc. can be something of a speculative art.  Indeed,
many of the earlier competitors to AT&T, such as Sprint, sought to package
rates in new ways based on new service combinations.  Sprint would lease
voice lines from AT&T and then send data over them.  Since data took up
less bandwidth than most voice, and could be packeted and switched in
smaller packages, Sprint would profit from the difference in these rates.
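
The figures quoted above can be checked on the back of an envelope.  Taking
roughly 4,000 characters as a page of text (my assumption), a T3 line does
indeed move on the order of 1,400 pages a second:

    # Back-of-the-envelope check on the T3 figure quoted above.
    t3_bits_per_sec = 45_000_000
    bytes_per_page = 4_000        # assumed characters per page of text
    pages_per_sec = t3_bits_per_sec / 8 / bytes_per_page
    print(f"{pages_per_sec:,.0f} pages per second")   # ~1,406, matching the quoted rate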

	Until fiber optic and other broad band networks become more
universal, cost and capacity will be intimately related.  Whereas a 128 or
64 kbps WAN might need special arrangements to get the best tariff rates,
such as renting a 'dedicated line,' establishing an ISDN virtual network,
or renting a satellite transponder, many rural areas might find that there
is real payoff in simply expanding the uses of traditional telephone and
cable lines to the fullest, something that will be discussed later in this
paper. With this being the case, there will need to be education to
aid administrators in determining just how to optimize the information
infrastructure until the fiber optic/broadband network gets there.  And even
then, cost will be an issue.  However, we now need to consider how
transformations at the mid-levels of the Internet and related networks are
also transforming just how high the costs might be.

		Transformations at the Mid-Levels: 
		     The Shattering Begins

	With the stage set like this, with a top-heavy switching hierarchy amid
the growing technologies of bypass, was it any wonder that information
users were constructing new kinds of connections?  The digital MCI's were
at the door, and the door was open.  Within the Internet itself, a number
forces have been pushing to undermine its traditional hierarchies,
flattening the hierarchy into a geodesic structure. With the increase in
activity and demand for Internet access, there has been considerable growth
in private, for-profit Internet backbones.  The popular notion that the
Internet is primarily devoted to educational, research oriented, non profit
traffic has long been patently false.  As early as 1991, over half the
Internet traffic was commercial, and by late 1993, over 80% of the traffic
was commercial.  Only traffic on the basic NSF backbone was limited to
non-commercial traffic, and this restriction was causing a lot of bypass.
In fact with costs and usage rising, the Internet has long been moving
towards some kind of privatization.

	Already, the main source of federal funds to the Internet, the NSF
(National Science Foundation) has announced that it will be shedding much
of its funding for the more 'public' backbones that had been organized and
run by ANS (Advanced Network & Services), presumably by April, 1994.
Instead NSF will concentrate on developing an even faster research network,
as well as developing the backbone for the National Research and
Education Network (NREN).  When in 1991 the ANS prepared to charge its
mid-level networks connection fees, a number of the independent and
mid-level providers, such as Alternet, PSInet, CerfNet, and Sprintnet
retaliated by organizing CIX (Commercial Internet Exchange) to promote
commercial 'IP' connectivity.  Other privately funded backbones have been
appearing in Canada, Europe, and in the rest of the world (Deutsch 1993:82,
Community Information Exchange 1993).

In many respects the establishment of CIX was none too soon in coming.  As
early as 1990, ANS had been pushing towards privatizing the network itself
and becoming a kind of monopoly carrier for both subsidized public research
connections and unsubsidized commercial connections, promising to
figure out a way to keep the two digital data streams apart when it came to
giving priority to the subsidized traffic.  As Gordon Cook has written:

  With planning for an NREN going forward in Congress and competition there
  between the Department of Energy and the National Science Foundation for
  lead agency, the NSF had ambitious plans for continued growth of the NSFnet
  backbone. Unfortunately for it [it] had little money with which to fund an
  upgrade to T-3 speed.  At some point in early 1990 with IBM in the lead,
  MERIT came up with a plan to create ANS as a non profit operator for a new
  high speed privatized commercialized backbone.  The NSF was asked to accept
  the privatization of the backbone by means of a realignment of the
  cooperative agreement with MERIT.  Short of terminating the cooperative
  agreement and the politically unthinkable course of immediately putting the
  backbone up for rebid, it had no other recourse but to accept the terms
  offered.  (Cook 1993)

The situation has similarities to the divestiture of AT&T.  The many
mid-level networks mentioned above were spun off from the federally
subsidized NSFnet and re-attached to the privatized backbone.  ANS set up
a CO+RE (Commercial plus Research and Education) subsidiary to handle the
commercial traffic.  Like AT&T, the old ANS seemed interested in profiting
from the new trend in telecommunication.  This is not surprising
considering that the ANS consortium, made up of IBM, MCI, and Merit, was
now in an ideal position to capitalize on its knowledge of how the data
networks run.  What complicated matters, however, is that part of its
business was still being subsidized by the government (i.e. the taxpayer),
while other income was being generated by access fees, much as AT&T had
tried to separate its basic, regulated services from its 'enhanced,' often
unregulated services.  ANS's declaration that it would be able to tell
which data stream passing through its switches was which, whether from its
private or its subsidized connections, and to give priority to the
subsidized research connections, was in fact not true:

  In allowing ANS to sell direct access to its own network (ANSnet) that used
  the same physical facilities as NSFnet, the NSF spoke of two virtual and
  presumably distinct networks. It properly insisted that commercial traffic
  placed by ANS on its network not diminish bandwidth needed by its own
  customers.  However ANS's January 15 1991 Proposal to the NSF made clear
  that once dumped into the network packets from its commercial customers
  would be indistinguishable from those of the NSF's customers (Cook 1993).

It is little wonder that the NSF proposed withdrawing direct support of the
networks, concentrating simply on subsidizing the creation of newer
advanced high-speed networks and on making more direct grants to various
carriers and services.  The future direction of this funding, and even the
notion of NREN, one of the main funding pipelines established by the
High-Performance Computing Act (S.272) on December 9, 1991, still remains
unclear.  The fact
that funding for the government backbone is going to end in April 1994
should not be so surprising given that the funders can no longer be sure
what it is they are funding. Indeed, according to Peter Deutsch, "Traffic
on the United States Government funded National Science Foundation (NSF)
backbone (once the core of the entire worldwide Internet because of its key
role as a transit point for traffic between third world countries) is now
dwarfed by traffic on private portions of the Internet, including such
groups as that operated by the CIX" (Deutsch 1993; cf. Rugo 1993).

While to the general public it seemed as if the government was simply
abandoning the NSFnet backbone just as the US was beginning to develop its
'Information Highway,' in fact the highway was beginning to fractally
decompose into a myriad of connections and strategies, and the old hands at
ANS were among the first to want to take advantage of the new situation.
The often heard debate as to whether the Internet should be a public
resource stressing public goods, or a private enterprise stressing
efficiency misses the point: the answer is a combination of both.  The
question as to whether we (whoever that is) should turn the Internet,
created out of public tax dollars, over to private interests, part of that
eternal capitalist logic of 'privatizing the profits and socializing the
losses,' cannot simply be applied to a network as hybrid and complex as the
Internet.  We need, in fact, to look at the organizational level.

	In Austin alone, direct Internet access continues to broaden.
While one can go the route of hiring access to the Internet via the
University of Texas, as is done by several local software firms, such as
Quadralay, access can now also be hired from various packagers (such as
Meritnet, Inc.) using the lines of SprintNet, Compuserve, HoloNet, and
UUnet/Alternet.  As will be discussed later, the Austin based Research
Consortium MCC (Microelectronics and Computer Technology Corp.) is
developing its own access and information-sharing network for regional
companies: EINet.  The Texas Department of Commerce likewise is seeking to
extend EINet to Open Network Enterprise (Texas-One).  James Ullrich, of the
Austin Independent School District told me that they were linking up all
the high schools in the Austin area with fiber optic strands beginning in
the Fall of 1994 and finishing by the Summer of 1995.  The school district
decided that it would be cheaper to drop off the Southwestern Bell local
exchange and set up its own telephone system.  Since fiber is generally
cheaper than cable, they decided to develop a system they could grow into.
They then brought other governmental and public institutions onto their new
network, which will become an Internet node by the Fall.  I was told they
didn't even really know what to do with all the switching and access they
were about to get.  Other Austin companies spoke of CAPs, competitive
access providers, as ways to bypass the local Southwestern Bell.  The
Austin Independent School District plans to provide direct Internet
gateways, bypassing the cost and congestion of University of Texas links.
All of these new access routes point to a flattening of the switching
hierarchies and a more geodesic informational infrastructure.

		Transformations at the Higher Levels: 
		    The Shattering Continues

	By January 1994, however, ANS, which had wanted to be the first to
capitalize on the new connectivity, finally capitulated and announced that
it too intended to join CIX, supporting the project of open networks.  No
sooner did this happen than, within days, ANS customers began announcing
that they would no longer give ANS their exclusive contracts.  The
California research network BARRNet announced that as of January 1, 1994,
it was withdrawing from ANS CO+RE service in favor of a T1 CIX
connection because:

	1. We have not been happy with the results of the ANS/CIX
  arrangement that was supposed to guarantee symmetry for traffic between 
  pure-research sites. Since there appears to be no way for ANS to fix this
  without cooperation from other network providers and since we can implement
  the same routing policy in place now with a direct CIX connection at much
  lower cost, we have decided to install the direct connection.  
	2. With the recent decision by ANS to join the CIX, it is now
  possible to route traffic to other ANS CO+RE customer via the CIX rather
  than having to make special CO+RE arrangements on behalf of BARRNet
  subscribers.
	3. There have been a number of cases where using ANS as a transit
  path to the CIX has made it difficult to resolve certain routing problems.
  A direct connection from BARRNet to the CIX should improve our ability to
  troubleshoot connectivity in such cases.  (BARRNet Announcement, January
  1994)

	Bit by bit, the old hierarchies of service organizations are
flattening out, and connectivity at all levels is expanding.  But how is
this affecting local communities?  What kind of policy might be suited to
this emerging geometry of distribution?  With the concept of the geodesic
behind us, we can finally turn to strategies.

				Part III 
	            New Contexts for Community Networks:
	     Rethinking Politics around the Pot Bellied Stove

	The model of the single electronic bulletin board holding forth as
a kind of electronic town hall makes sense in a world once organized by
mass media, where one central location can serve as a meeting place for the
diverse elements of the community.  That is not to say that mass media need
to have organized the meeting, but rather that there would be one physical
place which would reach out to the many.  At that central place different
constituencies could be bound into a community, as a roomful of Vermonters,
warmed by a stove and talking collective politics, might shame the
obstreperous into silence with a scornful glance while giving way as well
to the uncommon and unconventional.  Now the rooms in a virtual town
hall might lead anywhere, and be anywhere: local and not-local begin to
lose their coherence as online descriptors, at least as linked to geophysical
communities.  Instead of having one meeting place, why not have all the
different conversations echoed onto different local systems, each of them
serving their own constituencies?  But what will the level of discourse be?
What will the content be?  Who will be left out?  One of my rationales in
writing this article has been that only with a sense of the developing
geodesic infrastructure can these questions of sustaining geophysical and
virtual communities be approached.

	The principles I believe we need to consider in empowering
communities in an era of increasingly geodesic access will be formulated
in these terms: unbundling or bypass; rebundling or interface; content
provision; skill provision; and appropriate service.  We might think of
these as: taking apart, putting together, getting something on the nets,
getting something off the nets, and making sure it all connects.  Based on
space considerations, I will only allude to the symmetries at work here,
and will deal with these as stress points in the making of geodesic
telecommunications policy.  These principles should are the key factors in
organizing this structure, and in understanding what this structure is
organizing.

  	 1. Open Networks, Alternative Networks and Bypass: 
                    The Logic of Unbundling

	The importance of bypass and inter-network competition to the
community network cannot be overstated.  Should there be one subsidized
'pipeline' to the community service, however defined, or a competing
selection among such carriers?  Should we mandate interconnectivity as
being in the public interest?  The following anecdote should serve to
introduce this point.  Frank Odasz is the founder of Big Sky Telegraph,
which sought to link up the 114 one-room schoolhouses in Montana and to
integrate this connectivity with the business, civic, and general
public.  After a measure of success (I was one of its researchers), he then
wanted to link the resulting system up to the Internet.  His area mid-level
Internet carrier for Western Montana was NorthWest Net.  When he applied
for Internet access, however, NorthWest Net replied that if he included
non-educators on his system then he would have to pay business rates, even
if he wasn't trying to make a profit.  Tear the 'wide area community'
system apart, NorthWest Net urged.  Odasz in turn argued that he was trying
to be an educator to the whole state, and wanted to include all people in
this educational experiment.  He then turned to Colorado SuperNet, which
handles Wyoming, WestNet (also in Wyo.), and MidNet, which handles people
in Nebraska.  MidNet accepted the proposal and was ready to provide Big Sky
Telegraph Internet access using a dedicated 9600 baud inter-state
connection. When NorthWest Net saw what was happening, they relented and
provided Big Sky Telegraph its connection.

	Big Sky Telegraph sought to take advantage of educational rates of
the old mid-level ANS carriers, and played these carriers off one another.
However, there are national services connected to SprintNet and CompuNet
that might have provided bypass as well (and increasingly will).  As the
market for these kinds of services heats up, we can expect more such
competition.  Had the Telegraph the option of only one 'subsidized'
carrier, it might not have gotten its lower rates.  This kind of
competition continues to
grow, with carriers changing the strategies of how they package and
transmit data.  This unbundling and modular substitutability will extend
down into the local information loop, indeed into the computer itself, as
well, and out into the broadest range of the Datasphere.

	Spread Band Packet Radio, for example, promises to be a kind of
fiber optic bypass.  Spread Band Packet is a kind of digital radio that
makes use of a spread of available frequencies the way that packet switched
networks make use of a choice of available telephone lines.  By doing so
the available and limited radio spectrum is used much more efficiently.
That combined with other digital technologies such as data compression and
the tighter tolerances available with transmission error correction
suggests that computer enhanced radio might provide a kind of 'fiber optic
bypass.'  We might therefore hope to see more and more competition for
local information service, and with it, an undermining of the concept of
local telephone (voice/limited data) service.  Already one Austin
entrepreneur is trying to set up a T1 bypass via radio links.  Another
wants to make use of Austin Cablevision's excess cable channels to promote
his telco/data bypass. As stated above the Austin Independent School
District will be sitting on more capacity and connectivity within a year
than it will know how to use for some time.  MCI announced in January, 1994
that it intends to offer 'local loop' broad band service in conjunction
with existing cablecasting ventures, spending a billion dollars in the
process. The race to unbundle the final mile of the 'natural' monopoly is
on.
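
To make the spread-spectrum mechanism sketched above concrete, consider its
frequency-hopping variant.  The following toy sketch (in Python, my own
illustration; the 64-channel band and the shared seed are assumptions, not
features of any actual packet radio product) shows how sender and receiver
can derive the same pseudo-random channel schedule from a shared seed, so
that a transmission is smeared across the whole band rather than camping on
one scarce frequency:

  import random

  CHANNELS = 64   # assume 64 narrowband channels in the available spread

  def hop_sequence(shared_seed, hops):
      # Both ends seed identical pseudo-random generators, so they agree
      # on which channel carries each successive packet.
      rng = random.Random(shared_seed)
      return [rng.randrange(CHANNELS) for _ in range(hops)]

  # Sender and receiver compute the same schedule independently:
  assert hop_sequence(1994, 8) == hop_sequence(1994, 8)

Because the schedule looks like noise to anyone without the seed, many such
conversations can share the band, colliding with one another only
occasionally.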

	Likewise, we are finally witnessing the unbundling and
modularization of the very heart of the computer itself.  The hold of a
few design giants like Intel is giving way to the more open architecture
of RISC chips, as well as to Cyrix's open design.  When Intel itself
recently allowed independent software companies to compile for the Intel
Pentium (586) chip, it made a long-term strategic mistake if it had wanted
to hold onto its near-monopoly position in the PC market.  Compiling for
Intel today, a concern can go on to compile for Motorola tomorrow.  But
did megaliths like Intel and Microsoft have a choice, as the revolutions
that they unleashed began to tear them apart as well?  Those who ignore
the interconnectivity of the revolution will be bypassed.

	Along with the unbundling of the lines, there is also an unbundling
of the services connected to those lines.  Indeed, we should be seeing an
increasing unbundling of local telephone access, all the way to the
unbundling of most telephone services, with the potential for independent
contractors buying dialtone time (so that, perhaps, by buying a service one
would hear the weather or a radio station rather than a static dialtone).
Together we are witnessing the unbundling of the entire 'telephone system,'
as well as the demise of the theory of the local natural monopoly.  As of
1993, there were already 46 separately managed CAPs (competitive access
providers) in 80 U.S. cities, including the top 25 metropolitan areas,
offering larger businesses the ability to bypass the local telephone
companies by means of dedicated fiber or coaxial connections to the point
of presence (POP) of the long distance carrier (Huber 1994).

	The geodesic structure is fed not only by an unbundling of
connections, but also of the 'services' within these connections.  The
services become a kind of content, being perhaps recognizable as a
traditional media product, but also potentially something that is part
of the process of connection.  The distinctions between 'content' and
'carrier' become harder and harder to fathom.  George Gilder has
written about the issue of 'dark fiber,' where there is an unbundling of the
provision of the fiber optic line (which he seems to suggest would be
provided by a common carrier), and the source of 'light' at its ends.  He
mentions that much of the impetus for this unbundling has come from large
corporations who want direct access to the optical cables that many of the
telephone companies have.  After an initial experimental provision of this
capability, the regional Bell companies have been denying access.  The
result: companies like EDS have been laying in their own fiber optic cables
to bypass the bottlenecks, setting up their own networks where feasible.
(Gilder, to appear).

	Surprisingly, many of the extant telephone companies appear in a
rush to own large portions of the 'information highway' at what might well
be excessive prices.  Much of the concern about the merger of the cable
giant TCI, and the telephone giant, Bell Atlantic appeared to have followed
upon the logic that one company will be able to buy up the market, or at
least a good chunk of it.  Many of the old critical theorists, seeing an
old giant using its deep coffers to try to corner the competition, raised
the spectre of a new monopoly, unregulated and untouchable.  They extended the
old "pushbutton fantasy" articulated by critical theorist Vincent Mosco
which suggested that a few powerful companies could buy their way into key
locations, much as telegraph companies might have done in the past, such as
colonialist powers had done, and then dominate all their competitors: using
the electronic revolution to extend domination and control under the guise
of interpersonal liberty.  Using the telegraph model, the electronic
networks were networks of control and surveillance.  Mosco had noted the
economic power and media connections of the giants trying to muscle into
the videotext business, suggesting that they would try to manipulate its
potential, the kinds of things people saw with the same finesse as
newspaper stories had been manipulated over the years.  He might take note
that Rupert Murdoch, currently proprietor of the pan-Asia video service
Star TV, as well as holder of media properties worldwide, has recently
acquired Delphi Internet Services, a national on-line service with
increasing Internet connectivity.  The argument was not unlike that of
UNESCO's MacBride Commission, which suggested that the growing international
collection and distribution of news would favor a few media conglomerates
who had attained a strategic, international position.

	However, the movement of the intelligence of the networks to the
periphery, and the decreasing sunk costs to join, argue against such hard
and fast interpretations of centralization.  Electronic media products
seem to be following a different tack than that which supported the global
economic imperialism of material goods.  Since the
MacBride commission in 1979, not only have more countries around the world
begun to produce their own news and entertainment, and not only have more
alternative, grassroots exchange networks (of audio and video tapes, fax,
computer networks, Xerox technology, desktop publishing, etc.) developed to
undermine censorship (but not repression), but there have been more and
more communication schools and other production oriented facilities
developing to take advantage of these new networks and to address issues of
more local production values (cf. Uncapher 1994).  The videotext giants
have begun to crumble, or at least are losing their hegemonic grip on the
minds, hearts, and keyboards of their subscribers.  If Prodigy had a
billion to spend to create its videotext empire, it had a billion to lose
as well.  Prodigy's deep pockets may well have been not deep enough when it
underestimated the nature and extent of their digital competition.  The
Internet is undermining Prodigy's go-it-alone strategy, and with it, the
kinds of profits that a go-it-alone provider might reap.  Indeed if the
kinds of pushbutton fantasies entertained by members of the left whereby a
central group could disseminate revolutionary calls to empowerment had been
half way true, then indeed the capital heavy corporations like IBM and
Sear's Prodigy might be looming above our monitors even as I write.

	From this perspective then, had the TCI-Bell Atlantic merger
succeeded, the resulting corporation would have been swept into a world of
bypass.  If they had billions to spend, they also had billions to lose.
The AT&T divestiture reflects the instability of any 'connection company'
trying to own all of the links in a world of interconnectivity and bypass.
Take too much profit and then others will try to find routes around the
high-priced carrier.  The logic of the market appears to be pressing
towards greater bypass and decentralization, to a degree that even a few
billion dollars won't reverse.  However, the physical connection business,
really the province of transnational corporations, can show tidy profits,
especially in a world that is rapidly privatizing its telecommunications
infrastructure (cf. Uncapher 1993).  Further, the development of the
extraterrestrial infrastructure is still very much in its early stages,
presenting at least the specter of temporary market failure for the
narrower-band kinds of communication.

	If the telecommunications market is moving in the direction of
developing a variety of transmission pathways, such as spread band packet
radio, it is also moving to develop ever more powerful switches.  Consider
that at this point, one of the key problems with fiber optics is not the
cable, which can now be produced relatively efficiently, but the 'switches'
and 'transmitters/receivers' at their ends.  As electronic computer was a
development from many of the researches and resources of the telephone
company into electronic switches, so we will see the development of
photonic computers, computer that run on light rather than electrons.  The
revolution is fiber optics has often talked about, but not the optical
switches to make these optical networks.  These switches are, in so many
words, preliminary photonic computers.  The market is driving their
development, for they are needed if we are to have a switched optical
network delivering videos and then multimedia virtual reality applications
on demand.  As they are developed in the industry, they will be developed,
or diffused to the consumer.  After all, the photonic computer could
multi-process with a complexity and speed almost inconceivable to
electronic computers, making use of color and interference, not just on/off
opening and closing of digital gates.  Even without the photonic computer,
the box that sits on top of the television is not going to be a means of
retrieving video clips as the cable box metaphor suggests.  Rather it will
become an active device within the network itself, just as any computer with
an Internet IP already is.  With the photonic home computer, we will see
the strengths of the geodesic design growing, and the pre-existing
hierarchy being undermined even more.

	This last point about Internet IP needs to be clarified.  As I have
been suggesting, there is a development towards increased unbundling of the
carriers and their services, and the increasing intelligence of the nodes,
the endpoints.  In the case of the Internet, each machine or node is
responsible for itself, and for the little bit of 'line' near to it (cf.
Chee-Kai 1992).  By keeping that machine relatively open to public
switching and to data stream redirection, the endpoint becomes part of
the network's switching fabric, capable of carrying and directing a certain
amount of traffic.  The Gopher menu system on many Internet machines
depends on this distributed switching capability.  Each gopher host machine
can serve both as a menu system with resources, and as a switch to the next
server.  The switch is a computer and the computer is a switch.  One of the
reasons that the cost of Internet connectivity has been so low is not
that there is someone behind it all who has an immensely large pocketbook,
big daddy NSF (although the public does not always appreciate the billions
that have gone into creating and running the Internet), but because so many
machines are sharing the switching load.  The mess of 'headers' at the top
of many people's email messages often consists of an arcane list of the
different nodes the message went through.  The point of including this list
is to facilitate troubleshooting.  The multiplicity of the links is
implicit in much of the movement through the Internet, even when, with
Gopher or the World Wide Web, the movement from site to site, computer to
computer, nation to nation, is not even noticed.
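
As a concrete illustration of those headers, the short sketch below (in
Python; a minimal example of my own, not part of any system discussed in
this paper) pulls the 'Received:' lines out of a raw message.  Each relay
on the path prepends one such header, so reading them top to bottom
retraces the hops from the recipient's machine back toward the sender,
which is exactly what makes them useful for troubleshooting:

  import email

  def list_hops(raw_message):
      # Each mail relay prepends a "Received:" header; read top to
      # bottom, the headers walk the path from recipient back to sender.
      msg = email.message_from_string(raw_message)
      hops = []
      for received in msg.get_all("Received", []):
          # A typical header reads "from host-a ... by host-b ...; date";
          # keep the routing clause and drop the trailing timestamp.
          hops.append(" ".join(received.split(";")[0].split()))
      return hops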

	In fact, a set of switches or nodes along the data networks could
be privately or individually controlled, perhaps as initially worked out by
a WAN administrator and then retooled for the Internet circuit.  One can
imagine a future scenario where the cost parameters are factored into using
a switch (and its value added to the flow).  According to MacKie-Mason and
Varian, "The cost to purchase a router capable of managing a T-3 (45 Mbps)
line is approximately $100,000. Assuming another $100,000 for service and
operation costs, and 50-month amortization at a nominal 10% rate yields a
rental cost of about $4900 per month for the router" (1993).  One might be
able to buy a few switches and lines in strategic places along the net,
becoming a 'Point of Presence' (POP), and rent out the capacity.  In a
sense all these smaller workstation computers can be considered part of
the system of routers.  The networks might judge the reliability of the
switches, as they do now, and move traffic accordingly.
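
MacKie-Mason and Varian's figure is an ordinary annuity calculation, and it
can be checked directly.  The sketch below (in Python, my own arithmetic
check rather than their published model) amortizes the $200,000 of combined
router and operations cost over 50 months at a nominal 10% annual rate,
compounded monthly:

  def monthly_payment(principal, annual_rate, months):
      # Standard annuity formula: P * r / (1 - (1 + r)**-n),
      # where r is the monthly rate under a nominal annual rate.
      r = annual_rate / 12.0
      return principal * r / (1.0 - (1.0 + r) ** -months)

  # $100,000 router + $100,000 service/operations, 50 months, 10% nominal:
  print(round(monthly_payment(200000, 0.10, 50)))  # about 4907, i.e. ~$4900

The point of running the numbers is that a 'switch' is no longer a
monopoly-scale investment: a few thousand dollars a month is within reach of
a consortium of schools, libraries, or small firms.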

	What does this mean for policy? Presumably there is a need to
promote open, non-discriminatory access between networks so as to
facilitate market forces in selecting the most efficient networks. Will some
networks practice discriminatory non-connection? The answer is unclear, and
there is a great debate about this point in the networking community. One
camp currently wants to legislate mandatory open standards and
interconnectivity: allow the telephone companies the ability to transmit
video programming, but make sure that they allow others to use their
networks to do the same. This viewpoint is implicit in a number of bills
recently proposed at both the state and federal levels. For example, an
'Open Platform' policy to support universal access to the digital
information infrastructure has been included in the telecommunications bill
recently introduced by Rep. Ed Markey (D-MA), Rep. Jack Fields (R-TX), and
Rep. Rick Boucher (D-VA), having been formulated along the lines of Open
Platform principles set forth by the Electronic Frontier Foundation.  As
Vice-President Gore said of this bill at the National Press Club on
December 21, 1993,

  Suppose I want to set up a service that provides 24 hours a day of David
  Letterman reruns... I don't own my own network, so I need to buy access to
  someone else's. I should be able to do so by paying the same rates as my
  neighbor, who wants to broadcast kick-boxing matches...Without provisions
  for open access, the companies that own the networks could use their
  control of the networks to ensure that their customers only have access to
  their programming. We've already seen cases where cable company owners have
  used their monopoly control of their networks to exclude programming that
  competes with their own. (EFF 1993).

As I read the EFF's open platform policy, it goes beyond ensuring
nondiscriminatory access to download or to be an information provider at the
'consumer level'; it also seeks to promote interconnectivity at all
"levels" in the data networks.

	However, attempts to pin down what this kind of interconnectivity
might be have generally been met with consternation and out-and-out
disbelief by the connectivity industries.  Consider the legislation
currently (1994) being co-drafted in the State of Washington by Adam Fast.
Fast writes that his legislation:

  * Requires information transport providers to interconnect their networks
  for seamless transmission of voice, video, or data, at compensatory rates.
  * Requires open access at Commercial Points Of Presence (CPOPs) where
  interconnections between local and interlocal networks are made.  
  * Requires network interconnection specifications to be published by the
  Commission in cooperation with information transport providers.  
  * Specifies the process for the location of CPOPs in accordance with the
  Growth Management Act (GMA).
  * Specifies co-location of voice, video, and data network
  interconnections in a single community CPOP.  
  * Specifies that the community served by each CPOP be 10,000 users or
  less.  (Fast 1994)

As should be becoming clear, this kind of topology, mapping CPOPs to
communities the way the World Bank might once have wanted one public pay
phone for every 1000 people in the world, is misleading.  Is the community
he is talking about a material neighborhood?  There are many problems with
this proposal.  Let me cite a lucid, angry, extended response to the
announcement of this draft by the well-known virtual community researcher
Chip Morningstar, project director for Lucasfilm's graphical virtual
environment Habitat.  Morningstar begins by quickly covering many of the
arguments made in my paper so far:

  The Internet in particular and the global telecommunications infrastructure
  in general are expanding at an historically unprecedented rate. Prices are
  plummeting, bandwidth is rising, connectivity is spreading, providers are
  proliferating, access is becoming more and more available to people with an
  increasing diversity of technical capabilities and funding appetites, and
  interoperability is being recognized as a crucial element in nearly every
  major provider's business strategy.  All of these things are good, and are
happening naturally as a consequence of the natural forces of technological
  evolution and the marketplace....

He then goes on to use this background to strike at each of the points of
this legislation draft:

    >* Requires information transport providers to interconnect their
     networks for seamless 
    >transmission of voice, video, or data, at compensatory rates.  
  As an information transport provider, why should I have to interconnect my
  network to *anything*?  What if I am offering a special purpose service
  that only requires a point-to-point link and gets some cost advantage from
  not being more widely interconnected?  Is this going to become a crime?
  What if I am inventing a new communications protocol that I think will save
the universe but is incompatible with current standards?  Am I to be
  forbidden from selling a service which uses this?

    >* Requires open access at Commercial Points Of Presence (CPOPs) where
    interconnections 
    >between local and interlocal networks are made.  
  Why?  What if I want to provide services to network somebody else's company
  together?  Am I going to be required by law to make this network a public
  environment?

    >* Requires network interconnection specifications to be published by the
    Commission in >cooperation with information transport providers.  
  Why?  Proprietary protocols are now to be banned by law?  Is this even
  constitutional?

    >* Specifies the process for the location of CPOPs in accordance with the
    Growth Management >Act (GMA).  
  I don't know what the Growth Management Act is, but I presume it is a
  Washington state thing.  I'll guess that it's some kind of growth
  inhibition law of the form that seem to be getting more popular here in
  California too.  Setting aside for a moment any quarrels I might have with
  such a thing in general, what can it possibly have to do with
  telecommunications?  In particular, why should I have to get anyone's
  permission to open a POP?  Why should I have to go through a regulatory
  approval process that could take weeks or months instead of just issuing a
  purchase order for my equipment and going into business?  Why should I have
  to subject my business to a regulatory process in which my more established
  competitors can file objections and attempt to throw other legal roadblocks
  in the way of my competing with them (which is how such things always tend
  to end up working)?  What will regulatory inhibitions on my opening a new
  POP do to the currently astronomical growth rate such things are now
  seeing?  I can't imagine it's going to speed things up.

    >* Specifies co-location of voice, video, and data network interconnections
    in a single >community CPOP.  
  Again, why should I have to associate my business and services with
  others'? It might be a good idea, in which case I'll do it anyway, but if
  it's not, why should I be compelled to do so?

    >* Specifies that the community served by each CPOP be 10,000 users or
    less.  
  If my equipment only makes economic sense with 100,000 users, am I not
  allowed to use it?  What if I want to provide a dedicated service for a
  company with, say, 20,000 employees?  And why 10,000?  Why not 5,000 or
  15,000?  This kind of arbitrary restriction benefits almost nobody.

	I hope I can be excused the extended nature of this citation, but
this remarkable exchange serves to cogently situate issues of policy and
practice at the frontiers of cyberspace.  Morningstar's perspective is
eerily reminiscent of some of Mark Fowler's deregulatory logic as head of
the FCC under Reagan.  However, while Fowler made the dubious assumption of
the end of media scarcity in the 1980s, Morningstar has a better claim to
the potential for eliminating connectivity scarcity through competitive
bypass.  However, might access be used in a discriminatory manner?  Whom
should we mandate be connected to whom?  Will those who do not list
their connectivity specification gain a market advantage or will they be
bypassed?  It is one thing to say that it would not be in the interest of
profit to deny access to some party, group, or corporation; another to make
sure that this kind of discrimination does not occur.  This debate goes to
the heart of the question of redefining the common carrier in the era of
the geodesic information network.  How are we to distinguish content and
carrier?  That distinction of content and carrier is undermined by
something as simple as a gopher menu, which maintains content and provides
connectivity (I will return to the complexities and immense importance of
'menuing' in a moment).

	As Huber noted, telecommunications policy is having immense
difficulty surveying and keeping track of all the current innovations.  The
logic of the networks dictates that whoever does not connect will be
bypassed, bypassed by those who do want to connect.  This might be fine in
theory, but insofar as there are limits to the number of wires that might
be run to a house, and the spread band radio technology is still
undeveloped, should we not follow Vice President Gore when he argues that
we need to mandate that if the telephone companies are to provide any kind
of programming, they still need to provide access to whoever wants to send
or receive information from a home or business?  If the telephone companies
are to be allowed to carry video, should they not be required to carry
anyone's video feed as well, insofar as it is technically possible?  (Gore:
1/12/94)  Sometimes we need to make a distinction between long-term
inevitabilities and short-term problems.  Hence the debate.

	2.  Content Provision, or Putting it On-line: 
                   Access and Openness

	I will separate the issue of access into two domains: the provision
of materials to the networks and the availability of getting that
information off the networks, concentrating on the former in this
section.  Let me begin with the issue of open government.  In this section
I will argue that we need to get as much government information as possible
on-line as quickly as possible.  We need to make cyberspace much more
informationally rich than it currently is, and a much more reliable carrier
of information.  If some cybernauts complain about too much information, I
complain that there is not enough.  While the government has been
collecting data on individuals, corporations, and other governments,
individuals should be able to get access to such records as basic
legislative records, and so on.  In many respects this issue is the most
complicated one of this paper from a policy perspective because the policy
of openness and access needs to be balanced with privacy and with the
government's need for tariffs to run its operations, including its collection
of information.  I have detailed elsewhere the development of central
government databases, such as the Treasury Department's FINCEN computer
which seeks to keep track of all financial transactions so as to thwart
criminal transactions, and the civil liberty and privacy issues that such
massive databases raise (Uncapher 1991). I might point out that something
of the ominous nature of these databases would be mitigated by increased
government openness.  One of the important dangers of such
databases is their panoptic nature: they can see us but we can't see them.
This kind of imbalance needs to be addressed.

	The issue of the kinds of information that the government should
collect or disseminate goes to the core issue of how we are to design our
government.  The Reagan-Bush administrations drew attention to the notion
that government is often in competition with private industry in the
collection and dissemination of information, and suggested that private
industry held accountable by market forces should prevail in this new
world.  Political economists in turn have often noted just how often public
research and data collection gets privatized just before it reaches the
public, so that the public has to pay for research and data collection it
had already subsidized (e.g. Mosco and Wasko 1991).  Examples of this
include public funding for space research and remote sensing, for the
collection of business data, and for the publicly funded development of
pharmaceuticals and materials.  Often this data only reaches the public by
private channels: 'subsidize the losses and privatize the profits.'  It
would seem that if the information has been generated by public funds, and
is in the government information warehouses, our warehouses as citizens,
then we should have access to them.  Increasingly, we need to open these
channels up, so that if they are subsidized, and if they do not compromise
privacy interests, then they should be in the public domain.  And this
should be a government priority.  This kind of access is gradually
occurring, as the public is about to have on-line access to Securities
and Exchange Commission (SEC) filings.

	The benefits of providing such access are manifold and lie beyond
the context of this paper to elaborate.  In brief, we should note that not
only do we get a more informed public with access to the tools that it
wants and needs, but we also distribute the burden of government services
more broadly.  According to John Pavlik, when the Department of Consumer
Affairs in New York City set up an audiotext system to answer frequently
asked questions, such as those about business hours and departmental
functions, and to provide
information such as various regulations, they found that callers received a
busy signal 30% of the time compared to 50% before the system was
implemented (Pavlik 1994:151). How then might an on-line version of this
complement the provision of public information more fully?

	Much of the kinds of information a community network might want to
distribute is in these public records, and yet the public has had
difficulty in obtaining it.  For example, the Texas Public Register, which
'publishes' the doings of the Texas Legislature, was on-line for a mere
matter of days before it was taken off-line again by direction of the Texas
Department of State.  They ostensibly argued that they needed to study how
to generate revenue from access to such records, and whether charging for
access to their records might be a suitable direction to take.  Yet these
records are one of the most important sources of the doings of the Texas
government.  This is precisely the kind of information that Madison and
the other Federalists felt that citizens would need in order to be their
own rulers.  In California, a bill is currently pending (AB2547) that would
promote public electronic access to public records if the agency maintains
the records in electronic format.  Other activists hope to renegotiate the
contracts that allow only one (private) company, Counterpoint, to
redistribute the Federal Register itself!

	For now I would argue that the issue of recouping costs should be
secondary to getting the information on-line.  This means making sure that
the funds to set up basic servers, fix problems, set standards, and keep
up with innovation in access technology are provided.  When the
public is more fully involved with the on-line environment, then collectively
we should address the issue of the extent to which, and how, costs associated
with the provision of this information should be recovered.  For now, we
should consider this access as something that our tax dollars have already
paid for.  Simply providing information for the public is not enough if we
do not set out at the same time to disseminate the skills of how to access
and exchange information, or the nature of one's information rights and
responsibilities.

     		 3.  Finding it On-line: Access and Skills

	One of the key issues to organizing public information in the
electronic environment is to provide basic skills of information access.
Without these skills all the information in the world will not bridge the
gap between the 'information rich' and the 'information poor.'  If we once
spoke of universal service in terms of making a telephone available to
every household, now we might speak of universal service in terms of
bestowing a basic set of skills about how to acquire and use information.
It is one of the ironies of the information age that there are so many
people who do not know how to get access to the basic information about
their rights, about health issues, and so on.  Data without the skills to
turn it into knowledge is simply noise.  As a teacher I find many students
are on the one hand comfortable using electronic devices yet on the other
unaware of their potential.  Many seem to feel that the 'machines will do
the work for them' rather than seeing the machines as simply extending
their investigative natures.

	As a community activist involved in helping establish a local
public information service, I often hear reference to the issue of the
political economy of skills.  If we are to put all the effort into getting
a community network together and into facilitating new kinds of
interactions with government and other citizens, will we really be reaching
broadly enough, to the people who really need this information?  This
topic is now receiving a great deal of attention.  William Dutton writes
that we must always keep in mind that "technology is not simply the
equipment, but also the know how and expertise involved in using the
technology.  In this respect, there is a clear need for greater access to
expertise and technical assistance in computer and telecommunications"
(Dutton 1994:130).

	At the same time, we must realize that if the information on-line
is second rate, or could be gotten more easily somewhere else, then
learning how to use a computer or telecommunications will be of little
relevance.  Too many scholars seem to be falling prey these days to the
'technological mythos' which suggests that technology by itself is going to
solve our social problems.  The problem for minority businesses is not
that they do not know how to use computers or telecommunications, but that
they need to have access to capital, and to pre-existing business networks,
without prejudice.  I helped teach non-profit groups about computers and
telecommunications during the late 1980s and early 1990s and found that
while there was some desire to learn about better, inexpensive accounting
programs, the amount of information on-line that might be useful to them
was nil.  To then conclude, as do a number of policy writers, that non-profits and
minorities (however defined) will fall behind, becoming part of the
information have-nots because they have not traditionally made use of
telecommunications is simply misguided, if not demeaning.

	In Santa Monica, when access became more public, and debates more
public, and the information available for free or minimal cost became more
diverse, the public became more interested, and acquired the skills.
Homeless people on the publicly supported Santa Monica system were among
the first to make use of the community-wide system, getting the public to
understand their plight and to attempt to find some way out.  The issue was
not to go out to homeless people and get them to attend classes
about how their lives might be better with better information, but to
provide the services and resources (such as a community-wide audience)
that encouraged many to participate.

	Dave Hughes wrote that trying to build a national "information
highway" without teaching people the basic skills of information access is
"as if the US had started to build an Interstate Highway system before many
American knew how to drive" (1993).  At the same time teaching people how
to drive if there is no place to drive, nor highway for them to drive on is
likewise a waste of time.  Here the metaphor of the highway with its
assumption of hierarchies causes a misunderstanding.  The highways and the
local roads are being created at the same time.  Certainly there are
differences in line capacity, and federal money is being spent to research
projects like HPCC mentioned above.  But the 'highway' is more than lines
and links, it is also points of departure and destinations.  When I was
helping to organize the PennNet information network at the University of
Pennsylvania, I discovered that we had to provide services and real, useful
information on-line to seed enough interest so that the many academic
departments, hidden in their feudal LANs and individual cultures, would be
willing to pay the connect costs to join PennNet rather than hopping directly
onto the Internet.  The chairman and I argued, almost alone against the
technophiles, that with a few good services on-line the first departments
would join and the synergy of the local system would grow.

	Similarly, as more popular and useful information gets on-line,
then more people will want to get on-line with it.  Dave Hughes, who
helped establish the Big Sky Telegraph, has long made this
point: services pull rather than technology push.
the people who taught me this point.  When the teachers in rural isolated
communities realized that they could get access to other teachers to share
lesson plans and advice, to make friends, and to provide new educational
directions for their students and community, they were very interested, and
many made a difficult leap to becoming tele-literate.  The provision of
centralized farm and market information to ranchers in the same area a few
years before had been a failure, not because the ranchers lacked the skills
to use the equipment, but because they already had an efficient network of
information available through friends, magazines, and so forth that did the
same things that the new system was designed to accomplish.

	That said, as we begin to put more public information on-line, we
need to determine how we are best to teach the skills necessary to access
that information.  Should it be through libraries, through schools, or
elsewhere?  As with the network itself, there are bound to be redundancies:
some will make use of the libraries if they provide information; some will
read books; some will discover new ideas in the newspapers, some from
friends.  We need to anticipate these needs in the budgets of our
libraries, and prepare our students well.  However, I think that we need to
be foremost activists for getting valuable information on-line so that
people will want
to get on-line.  If there are things of interest to be found there, then I
will wager that the public media will each want to be the first to tell the
public what's there.

	In providing skills we need to make sure that we are not training
people to simply be consumers of information, rather than also its
providers and organizers.  Who will create the new videos, bulletin boards,
and so on?  Wouldn't newspaper articles simply concentrate on the consumerist
aspect?  In answer to this query, we need to consider two factors.  As has
been pointed out repeatedly, to the surprise of many network engineers,
much, if not over, 50% of the on-line traffic is interactive and social.
This fact played havoc with Prodigy's formula for economic profitability.
Their access topology assumed that people would want to access services
such as airline flight information, tickets, recent news, and so forth.  In
fact, the use of email on Prodigy took off, as users wanted to meet other
users; they could use their old travel agents more easily, could read the
paper in more than a screenful somewhere else just as easily, if not more
easily, and without having to be bugged by Prodigy's constant barrage of
advertisements.  This kind of email traffic overwhelmed the economics of
Prodigy's star hub topology.  When they quickly sought to recoup their
losses by dampening email usage, and using near dictatorial powers to make
sure that no commercial hints or information was transmitted on their
system other than what had been officially bought, Prodigy found that users
were quickly dissatisfied.

	Similarly, the French Minitel system was initially designed simply
as a kind of videotext retrieval system, the kind of videotext system
envisioned perhaps by Vincent Mosco in his Pushbutton Fantasies.  However,
on-line chatting of all sorts quickly became the dominant on-line activity
on the Minitel, followed by information retrieval.  The conversation load
grew so quickly that the Teletel backbone actually collapsed, leaving many
French telephone users without telephones for a day.  The moral of these
anecdotes appears to be that one of the things that on-line access provides
that off-line access (to a variety of competing media) does not is extended
sociability.  In other words, we may find that 'consumers' out there want to
be more than passive consumers and do not need to be constantly encouraged
to become information providers.  Perhaps the rhetoric about consumer
passivity was a product of industrialism and mass media, when the 'mass'
could not get access to the means of distribution of all that information.
This is rapidly changing.  Indeed, is it any wonder that the revenues of
the electronic games industry have already surpassed those of the film
industry in the United States?

	Secondly, another factor promoting on-line activity and engagement
comes with the companionship of a host of other people asking for
information and providing it.  We should bear in mind that once a basic
skill level has been achieved to gain access to the collective on-line
world, that world will in turn provide resources about how to become more
proficient on the networks, such as how to set up one's own gopher
system (should one want to).  Part of the experience of being on-line is
not simply 'getting information' from some slickly designed interface, but
simply asking questions, or asking people where the right place to ask a
question is.  The on-line world is, as many have pointed out, a living
database.  With the expansion of what I have elsewhere called the global
grassroots infrastructure (Uncapher 1994), we should begin to see a variety
of grassroots empowerment groups seeking to provide individuals with access
to empowering information, finding ways in the geodesic universe to
circumvent restriction and to understand and fight oppression.

   	     4.  Bundling all the World back Together: Interface

	If the Datasphere is becoming a sea of potential connections, then
we will need tools with which to navigate those connections.  The tools
that facilitate navigation might be generally known as interface, serving
to bring together a disparate collection of intentions and presentations.
As we change the way we navigate the datasphere, or move into cyberspace,
then the skills will also change.  There is a politics to the interface in
the sense that it is at the level of the interface that networks and nodes
can become hard or easy to use.  Bad interfaces repel and frustrate people
and lose information.  Some of the frustratingly poorly designed Internet
interfaces found on old mainframe computers and retooled for modern
Internet use (such as those on VMS) stem from the era when mainframe computers
were the domain of a few data priests who hoarded their knowledge about
how the system worked, and held onto their power as the necessary
intermediaries between the machine and the desired outcome.  This is
gradually changing, and we can anticipate better interfaces using a variety
of input devices and flexible, forgiving data tools, such as 'intelligent
interfaces' that make use of an artificial intelligence, fuzzy logic, or
expert neural network system that anticipates our needs and proclivities.
The issue of interface involves much more than this, however.

 			Distributive Logics

	Consider that much of the information and debates on a community
network might be distributed across a number of nodes, in dozens of
different centers of activity.  It is up to the interface to collect them
all together in a way that is useful and pleasing.  As I stated in my
introduction, one of the stumbling blocks for the organization of a central
community information system such as PEN in Santa Monica has been trying to
decide just how centralized such a system needs to be.  Since resources can
be distributed, who or what should be responsible for organizing them?  In
the Santa Monica system, the municipal government has, for $200,000, funded
a central computer, provided public access terminals in a variety of
locations, given access to documents (including schedules, reports, and
proposals) from city hall and other public buildings, and facilitated a
variety of interactions and services, including email, a common 'community
center,' and different conferences, reaching, however, less than 5% of the
population (cf. VanTassell 1994; Dutton 1994).  However, other communities,
such as Blacksburg, Va., a community that does not have the same resources
as Santa Monica, opted to set up a gopher client, a menu system that
includes some information specifically related to Blacksburg, and set up
links to other systems for information that might be useful.  Whereas Santa
Monica uses a centralized, magnet model, Blacksburg uses a more
decentralized, but open system.  Both models have their selling points and
drawbacks.  In fact, both systems might be seen as quite similar, varying
on the nature and extent of their openness.

	Santa Monica's Caucus II conferencing software provides the ability
to engage in extended conversations, to get access to several databases,
and I believe to provide some files of one's own.  Blacksburg Electronic
Village does not have very much conversation: it seems like a silent
gateway to a massive conversation and information exchange happening
elsewhere.  However, the two systems are related.  We must first consider
the more general environment of other virtual cafes and clubhouses, both
in Santa Monica and Virginia.  Bulletin board systems exist in an
environment of alternative boards.  A neighboring bulletin board system
might also provide free Internet access, local discussion, FidoNet mail
feeds, and so on, which are not available on Santa Monica's PEN system.
Some PEN users have complained about such annoyances as 'net bozo flaming,'
when a single user continues to hold forth.  Where do these neighboring
bulletin boards fit in?  What if the alternative virtual cafe and town hall
could offer me greater storage space, custom, more usable access software,
broader dimensions of connectivity, and perhaps more specialized submenus?
Should Santa Monica's system seek to act as a gateway to these systems?
In comparison, the Blacksburg Electronic Village is simply a shell, more a
node as connector rather than as destination.  However, any of the other
regional Virginia systems that have developed some kind of Internet access
can include the 'Village' as part of their offerings.  We need to now ask
how this might be so, and pursue a bit further its implications.

	The Gopher software which organizes the offerings of the Blacksburg
Electronic Village provides something of a prototype for a whole system of
'client' software. The idea of a Gopher system is to provide a menu on
which one can find listed: subdirectories, files, resource programs, or
links to systems that might be off the gopher.  "It lets you browse for
resources using menus" (Krol 1992:190).  The surprising thing is that a
subdirectory that might appear to be just another subdirectory might be
half a world away.  That is, you can add someone else's gopher menu as a
subdirectory to your gopher system.  The recursive possibilities become
immense.  For example, Prof. Anne Bishop of the University of Illinois
recently asked me for a research report I had written as background for her
own study of the Blacksburg Gopher.  I was able to find my report already
available on the Blacksburg Electronic Community itself, on a submenu of
topics related to community networking not far from the root menu of the
Blacksburg Gopher.  In actuality my paper 'existed' on a gopher menu
provided by the WELL in the San Francisco Bay area.  The two menus had been
linked to one another.  The Gopher 'client' program at Blacksburg had
automatically made a link to the 'server' in the Bay area.  For the
Blacksburg user, however, that need not matter: my paper was only a
subdirectory away.  The link between systems was hidden.
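
	Concretely, a gopher menu is just a list of tab-separated item
lines, each naming an item type, a display string, a selector, and the host
and port that serve it.  The rough Python sketch below shows how a client
can fetch and parse such a menu; the host name is purely illustrative, and
a real client would add error handling and the remaining item types.

  # A minimal sketch of how a gopher menu can point 'half a world away':
  # each menu line carries not just a selector but the host and port that
  # serve it, so a remote server's directory appears as a local submenu.
  import socket

  def fetch_menu(host, port=70, selector=""):
      """Request a gopher menu and parse its tab-separated item lines."""
      with socket.create_connection((host, port)) as s:
          s.sendall(selector.encode() + b"\r\n")
          data = b""
          while True:
              chunk = s.recv(4096)
              if not chunk:
                  break
              data += chunk
      items = []
      for line in data.decode(errors="replace").splitlines():
          if not line or line == ".":
              continue  # a lone "." terminates a gopher listing
          itype = line[0]
          parts = line[1:].split("\t")
          while len(parts) < 4:
              parts.append("")
          display, sel, ihost, iport = parts[:4]
          items.append((itype, display, sel, ihost, iport))
      return items

  # To the user, every type-"1" (directory) item looks alike, whether it
  # is served locally or by a server in another hemisphere.
  for itype, display, sel, ihost, iport in fetch_menu("gopher.example.edu"):
      origin = "local" if ihost == "gopher.example.edu" else "remote: " + ihost
      print(itype, display, "->", origin)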

	Gopher software is still in a rudimentary stage.  It tends to be a
line-oriented text and image retrieval device unless one runs it with a
specially constructed interface over some kind of TCP/IP connection.  One
does not use it to interact in newsgroups, interactive email, etc.  Rather,
one uses gopher as a way to manage the pathways to where such activities
might be engaged in.  One can imagine a better linkage between the 'read
news' interface (or one of its variants), which provides interactive access
to the global UseNet discussion, that global bulletin board, and a
gopher-like menu system.  There needs to be a better way of avoiding
transport bottlenecks, rather than making use of an overly busy link when
others are available: the links need to become more dynamic.  There need to
be ways to designate degrees of access to information the way UNIX and
other such multi-user operating systems permit a variety of privilege
levels.  What are the implications of these multi-machine linking
interfaces for public policy?

	There is a kind of competition going on in providing better, more
integrated menus.  By integrated I mean that the same activities (going up
or down a menu) will be done with the same key strokes, mouse clicks,
kinetic gestures, or vocal commands in a variety of otherwise different
contexts.  If I want to go up a menu, I should be able to type 'u' whether
I am in gopher, in a mailer program, a news reader, or whatever.  The goal
is to provide a flexible, integrated, seamless menu that acts just like any
other drive or application on one's home computer.  The home is becoming
more integrated while the highway is becoming more geodesic.  For example,
Engage Communication, Inc. provides a 'one-step connection and file
transfer solution' which sends files around the global WAN by having a user
simply drop the icon of a file onto a destination icon found in the
transfer directory.  The computer automatically makes the connection (or,
if one is already on-line, uses the open line to speed up the movement).
More generally, using a direct 'Slip' connection, which links even a PC or
a Macintosh directly to the Internet, one can already treat gopher (or one
of the World Wide Web clients) as just another directory in one's windowing
program.  A file in any subdirectory might be located on your own hard
drive, or it might be somewhere else.  If there is a cost associated with
accessing 'one of your computer's directories,' then that might be noted,
and the needed file might be transferred, if possible, to a directory
without such an associated cost.  And as always, it is wise to back up the
files that you most need on your own disks/tapes/flopticals, since you can
lose all the data.  Likewise, what happens when the remote server
discontinues a service?
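
	As a sketch of what such integration amounts to in software terms,
consider a single dispatch table shared by otherwise different tools, so
that the same keystroke triggers the analogous action everywhere.  The
Python below is only illustrative; the contexts and actions are invented.

  # One keymap shared across contexts: 'u' means 'up' whether the user
  # is in a gopher menu or a news reader.  More bindings (down, quit,
  # search...) would be added to the same table in the same way.
  KEYMAP = {"u": "up"}

  class GopherMenu:
      def up(self):
          print("gopher: back to the parent menu")

  class NewsReader:
      def up(self):
          print("news: back to the group list")

  def dispatch(context, key):
      """Translate a keystroke into whatever the action means here."""
      action = KEYMAP.get(key)
      if action and hasattr(context, action):
          getattr(context, action)()

  dispatch(GopherMenu(), "u")   # gopher: back to the parent menu
  dispatch(NewsReader(), "u")   # news: back to the group list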

        The Fragmenting Idea of a 'National Information Service'

	Some individuals have argued for a 'national public information
service' (cf. Williams & Pavlik 1994:69-101), whereas others want to see a
more local municipal service with more local services (Williams 1994:78;
Hadden 1993).  To many of these writers, a national public information
service would distribute congressional records, Supreme Court opinions, and
executive branch records and proposals; basic health information; social
service information, such as information on health care providers and
services; basic business information, such as explanations of import/export
restrictions and possibilities; information about financial markets,
including the public records of the companies in those markets; and
information about collective issues.  The local, the national, and the
transnational can coexist on a more hypertextual, dynamically linked,
gopher-like system.  The menu in a sense collects this information into a
usable bundle.  This bundle is not completely preset by the service giving
the information, but can be redesigned by intermediaries along the way.  I
might log onto a local bulletin board running on a used 386 computer that
is itself running a Slip connection to the Internet.  If I wanted to find
some information related to the rights of renters and landlords, I might
activate a search of documents with 'landlord' as a keyword, designating
whether I want the search to be local, regional, national, or in terms of
some other specially defined parameter.  The flexibility of an open system
is that, as different information providers come on-line, they can be
integrated into the menu in front of me.
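
	A rough sketch of such a scoped search, in Python: the same keyword
is fanned out to whichever tier of providers the user designates.  The
registry and host names are invented, and the actual query to each host (a
WAIS or gopher search, say) is elided.

  # Hypothetical registry of information providers, grouped by scope.
  PROVIDERS = {
      "local":    ["city-hall.example.gov", "tenants-bbs.example.org"],
      "regional": ["state-library.example.gov"],
      "national": ["congress-records.example.gov"],
  }

  def search(keyword, scope="local"):
      """Fan the same keyword query out to every provider in the scope."""
      hits = []
      for host in PROVIDERS.get(scope, []):
          # A real system would send `keyword` to `host` here; as new
          # providers come on-line, they simply join the registry.
          hits.append((host, "documents matching " + repr(keyword)))
      return hits

  for host, result in search("landlord", scope="local"):
      print(host, "->", result)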

	The importance of dynamic menus in the geodesic information
structure cannot be over-emphasized.  Some of the larger telephone
companies still appear to imagine a world of giant gateway services where
they might be able to offer a menu to a variety of services, and then
generate revenue from the transaction based on some rate structure, such as
a percentage of the transaction, or in terms of cost per connect time.
This might be so, but if their menus appear to be too expensive, or their
rates too high, then users will be able to navigate around them.  Whereas
the older video bundling services, such as the television networks and
cablecasters, might dream of providing their own powerful menus, there is
little to stop producers from simply setting up a kind of point of presence
in the datasphere and letting the menu gleaners distribute the product.
Perhaps some producers will want to help underwrite their expenses by
offering some kind of initial pathway exclusive.  Virtual property,
'netcash,' and ownership, which will be available to some extent by virtue
of cryptography and external penal sanctions, can potentially work to favor
some pathways, and some menus.  Yet it should prove difficult for content
to stay on its designated pathways.  We should be able to look to
individuals purveying the best menus at the cheapest price.  That said, we
must bear in mind that the 'node as menu' is not simply a destination
(although it might be), but also a transit point.  With the amount of
information, services, conferences, programming, and ideas available
on-line growing by leaps and bounds, the menu, or more accurately, an
organized selection of material with a user-friendly interface, will grow
in importance in rebundling material and providing access.
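
	The logic of routing around an overpriced gateway can be put in a
few lines.  In the Python sketch below, several menu providers advertise a
path to the same resource; the user's software simply takes the cheapest
offer.  Providers, prices, and the resource name are all invented.

  # Competing menus listing paths to the same resource, at different prices.
  menus = [
      {"provider": "telco-gateway",  "resource": "renters-rights", "price": 0.50},
      {"provider": "local-bbs",      "resource": "renters-rights", "price": 0.05},
      {"provider": "freenet-mirror", "resource": "renters-rights", "price": 0.00},
  ]

  def cheapest_path(resource):
      """Pick the least expensive provider offering the resource."""
      offers = [m for m in menus if m["resource"] == resource]
      return min(offers, key=lambda m: m["price"]) if offers else None

  print(cheapest_path("renters-rights"))
  # -> the free mirror wins; the expensive gateway is simply bypassed.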

                  The Object of Search is not an Object

	Further, we need to understand that in the geodesic information
infrastructure the objects retrieved are changing identity and format as
well.  Documents and virtual objects are becoming menu-like, becoming
active documents on-line rather than the passive one published on paper or
handwritten.  Books transposed themselves in the electronic environment are
becoming more like menus.  Writing reflects the collecting and organization
of information, a fact attested to by bibliographies and footnotes.
Virtual books such as might be found on the World Wide Web are becoming
more like hypertext documents, where links to other documents are included
in the document itself.  Hypertext theoreticians have long realized that a
book transposed into an electronic environment was not simply an
electrified book, accessible at the flip of a switch.  Rather a book became
more like a knot of ideas, perhaps with an ideal reading, but one which at
the same time could be read along many different pathways.  Ted Nelson
envisioned a hypertext world in which any phrase might be linked to any
other, facilitating our exploration of the background, context, or
correlative material of an idea, or even a fragment of an idea (Nelson
1972).  Nelson called the final book with all its hypertexts links
available Xanadu after Coleridge's stately, mystical world where Kublai
Khan built his court and surveyed the world.  I was recently asked by a
Austin computer manufacturer if I might put the final version of the paper
you are reading in World Wide Web HTML (Hyper-Text Markup Language) format
so that it might become 'hypertextable' and useful on his Austin theme
World Wide Web node. One professor at the University of Texas at Austin
assigned his architectural students the term long task of
creating/assembling hypertext books for a virtual architectural school.  An
image, object, word, or text fragment might lead to another document.  The
kinds of documents that will be of importance to the community network are
rapidly changing form as the networks become more geodesic.
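
	What makes such a document menu-like is easy to show: the links
live inside the text itself, so any phrase can open onto another document.
The HTML fragment and the addresses in this Python sketch are illustrative
only.

  import re

  page = """<p>See the <a href="gopher://well.sf.ca.us/">WELL gopher</a>
  and the <a href="http://bev.example.net/">Blacksburg Electronic Village</a>
  for background on community networking.</p>"""

  # Every anchor is at once part of the prose and an entry in an
  # implicit menu of further documents.
  for target, label in re.findall(r'<a href="([^"]+)">([^<]+)</a>', page):
      print(label, "->", target)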

	Federal funding is already beginning to go to developing new
menuing and interface programs.  The NSF is about to fund a number of
research efforts into providing better menus to interconnect computer
servers (NSF 1993).  This makes sense, since the government itself will
find that it too needs access to its own records, which will be available
through a variety of networks.  The kinds of menus that are developing can
include access to a 'local' community discussion group available at initial
login, with other pathways into the far reaches of the datasphere also
available.  This kind of topology is already available on a system like the
WELL.  There a local user could log in from the San Francisco area, and
simply access the conferences.  Since this is a subscription service, the
kinds of discussion could be different from those found in the general
datasphere, in places like the Internet.  The WELL is still only a stopping
point between worlds.  Perhaps one might add a 'front end' program to one's
home computer to create menus that would help navigate the WELL's rather
arcane, command-driven interface (otherwise one must type r to read, type b
for browse to see a list of topics in the conferences, type ?conf to see
all the available conferences, etc.).  However, once connected, a user
might then shell to the deeper Unix level to use gopher, telnet, or ftp out
from the WELL.  In a way, we then have a local conferencing system working
with a variety of connectivity possibilities (for a fee).  Or else one can
reverse the process and telnet or remotely log into one's account on the
WELL from the Internet.
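
	A 'front end' of this kind need be little more than a table that
translates friendly menu choices into the WELL's terse commands.  The
Python sketch below uses the commands named above (r, b, ?conf); everything
else, including how the command would travel down the modem line, is
hypothetical.

  MENU = {
      "1": ("Read new responses in this conference", "r"),
      "2": ("Browse the list of topics",             "b"),
      "3": ("See all available conferences",         "?conf"),
  }

  def translate(pick):
      """Turn a friendly menu choice into the command the WELL expects."""
      label, command = MENU[pick]
      # A real front end would transmit `command` over the connection here.
      return command

  for key, (label, _) in sorted(MENU.items()):
      print(" ", key + ".", label)
  print("choice 2 sends:", repr(translate("2")))   # -> 'b'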

	Models for organizing and distributing information are still being
developed.  Tom Grundner, founder of the Cleveland Free-Net and head of
the National Public Telecomputing Network (NPTN), has proposed, and begun
work developing, the idea of a 'Corporation for Public Cybercasting' based
on the example of the Corporation for Public Broadcasting (Grundner 1993).
Member Freenets (which use NPTN's proprietary FreePort conferencing
software) could subscribe to different member services, while providing
their own local 'programming' as well.  Some of the 'distributed projects'
that are currently being shared include 'kid trek' for young science
fiction writers, the 'NPTN student news network,' and the 'student art
network' to encourage students to 'publish' their two-dimensional works
(NPTN 1992).  Implicit in this strategy for cost sharing is the final
dimension of the changing environment of community networks, one that has
been in the background of much of this discussion: cost, the cost of being
on-line, and of getting on-line.

    5. Developing Appropriate Service in an Environment of Changing Costs

	An article by John Markoff on the November 3, 1993 front page of
the New York Times spoke of "Traffic Jams Already on the Information
Highway."  While many new users of the Internet continue to speak loudly
and happily of all the things that can be found on it, there is still
little awareness of the ever-increasing demand the Internet is managing.
As Markoff puts it, "Call it a cautionary tale for the information age: the
nation's increasingly popular data highway is beginning to groan under the
load of rush-hour traffic."  Requests for the World Wide Web service,
again, the more multi-media version of the popular Gopher menu service,
exploded from 100,000 per day in June to 400,000 per day in October, 1993.
The administrators responsible for handling these requests suggest that the
only solution to increasing demand may be "to take a $15 million super
computer away from its normal scientific number-crunching duties and employ
it full time as an electronic librarian" (NYT 11/3/93; A-1).  On the
services side, another example should prove illustrative: MSED, Inc., a
free on-line career counseling service based in Ann Arbor, Michigan, saw
the demand for its services rise from 1,000 inquiries per day in June,
1993, when it opened, to 12,000 inquiries per day at the end of October.
Overall, Internet traffic has been increasing at 20% per month (Sterling
1993), which, as a compounded rate, means Internet usage, either in terms
of users or data transfers (the data were unclear), is doubling roughly
every four months.  Each month brings with it new books on the Internet,
and new journals for popular use, such as Internet World, are vying with
more research-oriented journals for shelf space.  It would be naive at this
point to think that this kind of connectivity to these kinds of services
can continue without a change in the pricing structure.
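
	The arithmetic is worth making explicit, since 'doubling in less
than a year' would badly understate growth of this kind.  A short check in
Python:

  import math

  monthly_rate = 0.20                       # 20% growth per month
  months_to_double = math.log(2) / math.log(1 + monthly_rate)
  print(round(months_to_double, 1))         # -> 3.8 months to double
  print(round((1 + monthly_rate) ** 12, 1)) # -> 8.9x growth in one year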

	The issue of cost is therefore deeper than simply allowing for the
'commercialization' of the Internet, that is, allowing businesses to
provide services alongside a new collection of government, research, and
private usage (cf. Locke 1993).  While many newcomers to the Internet seem
much taken by the services it offers, and by the amount of learning it
takes to master the strangely designed services, this should not blind us
to the complexities of the emerging Internet structure.  Developing a price
structure, some claim, would help to direct this torrential flow.
According to Kleinrock (1992):

  One of the least understood aspects of today's networking technology is
  that of network control, which entails congestion control, routing control,
  and bandwidth access and allocation.  We expect that if access to
  Internet bandwidth continues to be provided at a zero cost there will
  inevitably be congestion.  Essentially, this is the classic problem of the
  commons: unless the congestion externality is priced, there will inevitably
  be inefficient use of the common resource.  As long as users face a zero
  price for access, they will continue to overgraze.

	There has in turn been considerable debate in Internet circles
recently about what the eventual rate structure might look like.  Certainly
one might want to charge for 'connect time' or for 'bytes transferred' or
some such combination.  However, the actual accounting procedure for this
kind of pricing has yet to be determined.  The problem is that the topology
and nature of the flows do not lend themselves to easy metering.  After
all, much of the switching is being done by activities embedded into the
network itself.  To introduce a barrier to the flow in order to establish
the measure might be more bother than it is worth.  The analogy might be to
the development of the early postal system.  During the early development
of the postal system, following the example of the Fugger newsletter
service, the sender (or receiver) would have to pay according to the
distance covered, the weight of the package, and so on.  This kind of price
structure was eventually replaced by a flat rate structure when British
postal economists determined in the early 19th century that it cost more to
determine the distance and administer the charge than it did simply to
charge everyone a flat fee, no matter whether the package was going 5
blocks or 500 miles.  The administrative costs proved to be more than the
service was worth.  The same has been true of measuring costs on the
geodesic information infrastructure.
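
	A back-of-envelope version of the same argument, with invented
numbers, makes the postal economists' calculation concrete: if metering
each message costs more than the price differences it recovers, flat
pricing wins.

  messages         = 1_000_000
  metering_cost    = 0.05   # hypothetical per-message cost of measuring usage
  avg_price_spread = 0.03   # hypothetical revenue gained by metered pricing

  net_gain = messages * (avg_price_spread - metering_cost)
  print(round(net_gain))    # -> -20000: metering costs more than it recovers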

	The economics of the networks are therefore difficult to determine
with the simple concepts of 'highway' and 'turnpike.'  Cost analyses should
prove to be in flux in the coming years.  Some parts of the networks will
prove quite expensive to join, while others will be virtually free across
vast sections.  Until we have a broadband network to all our schools,
public institutions, and homes, community networks need to consider the
appropriate connectivity, in terms of responsiveness, reliability, or
'throughput.'  Should the lines to the Internet carrier run at 2400, 9600,
or 28,800 bps, or be a T1 or T3, or what?  Currently, the higher the
speeds, the higher the associated costs.  Who should run the system?
Should it be all volunteer, or a turnkey system developed and run by
outsiders?  What other costs, such as liability insurance, need to be
considered?  To begin with, the geodesic nature of the system suggests that
we might have many local experiments, and that trying to develop one single
system could well be an increasingly difficult proposition.  Rather, we
will see local systems bound together in increasingly innovative ways.
Each node along the way will have costs associated with its particular
size, connectivity, etc., and these will vary, whether one is Santa
Monica's PEN, Big Sky Telegraph, or a much smaller system.

	The degree of connectivity is a key here.  Direct Internet
connectivity means opening one's site to be available to the rest of the
Internet 24 hours a day.  This degree of connectivity can facilitate
gopher, telnet, finger, etc. at high speeds.  Yet this is only one kind of
connectivity that is valuable to the community network.  In geodesic
fashion where a node as endpoint becomes a transit point, a local node
might tag onto another's Internet access, using the intermediary computer's
excess capacity.  Such remote nodes may or may not have an Internet address
of some sort.  In Austin, for example, several bulletin boards make use of
'Slip connections' to get relatively direct access to the Internet.
Generally, Slip connections provide temporary but full access to the
Internet.  A user on such a system might well have a personal Internet
email ID, but the machine or node itself might not be 'available.'
Narrower yet in connectivity scope, but not in importance, many bulletin
boards around the world are providing 'store and forward' capabilities to
the Internet through such protocols as UUCP.  Essentially, mail is
exchanged with the Internet at a frequency determined by the local board in
accordance with its costs and capabilities.  Such 'store and forward'
capabilities allow the local board access to Internet email, mail lists,
UseNet news groups, file transfers, and a variety of information request
systems.  Already a number of bulletin boards in the Austin area are
offering free Internet email addresses.
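
	The store-and-forward pattern itself is simple, which is part of
its appeal.  In the Python sketch below, user postings queue on the local
board and move in a single batch at the chosen exchange hour; the queue,
messages, and host name are hypothetical stand-ins for what UUCP does over
a dial-up line.

  outbound_queue = []

  def post(message):
      """Users post at any hour; nothing goes out until the exchange."""
      outbound_queue.append(message)

  def nightly_exchange(host="uucp-gateway.example.org"):
      """One dial-up session moves the whole batch in each direction."""
      print("dialing", host, "- sending", len(outbound_queue), "messages")
      outbound_queue.clear()
      # ...then receive the inbound batch: email, mail lists, news feeds.

  post("To: listserv@example.edu -- subscribe communet")
  post("To: friend@well.sf.ca.us -- hello from the local board")
  nightly_exchange()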

	This final tier of access has great potential for community network
development.  As Frank Odasz has said, the reach of the Internet is already
available to any community, although it might be too costly at this point
to provide full-time Internet connectivity:

  But, Internet access is not a black and white issue, there are different
  levels of access, and benefit, that challenge many of the prevailing
  assumptions about the cost/benefit ratio. For many Internet users, the key
  power of the Internet is the connectivity with other minds; the Internet as
  a community of communities. Communications with 10 million Internet users,
  with 100% reliability and convenience, is possible WITHOUT full Internet
  access. Internet messages stored on a local community bulletin board system
  (bbs,) for nightly transfer via high speed modems, can bring email benefits
  virtually identical to expensive full Internet access. (Odasz 1993)

When Odasz's Big Sky Telegraph received grant funds from US West to expand,
it did so not by becoming a more complex and larger system itself, but by
strengthening and decentralizing its outreach program.  Since telephone
tariffs are still high in rural areas, especially intra-state rates, Odasz
worked to create a network of FidoNet nodes around the state (six are
currently in operation) that would bundle the messages, files, and requests
together.  Then, at a designated mail hour, the various remote sites would
connect with each other and with the central Big Sky Telegraph system in
Dillon, Montana.  Since the Telegraph has also become a gateway to and from
the Internet, these remote systems could now provide their users with many
Internet resources at minimal cost.  As Odasz summarizes the situation:

  Logistically, even with full Internet connectivity, we must wait for our
  mail to be read and answered. For the purposes of building global
  communities of learning, or trading, based on interacting regularly with
  experts, the REACH of the global Internet is well within hand for ANY
  community member on a shoestring budget. A community's choice of twelve
  hundred Internet discussion groups can be "echoed" on local bbses with
  great economy. Newsletters and listservs on rural and community development
  are already being shared worldwide... while the IMMEDIATE interactivity is
  NOT present, well targeted searches can often result in the needed
  information within a 24 hour period, or less. This is true for FTP ordering
  of files and the use of Internet mail to automatically search many
  different forms of databases. (Waismail, gophermail, ftpmail, and more.)
  (Odasz 1993)
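
	The cost logic behind this strategy can be put in a few lines.
Every figure below is invented for illustration (Odasz's own cost model
supplies real ones), but the shape of the comparison holds: a short nightly
call is an order of magnitude cheaper than a dedicated link.

  nightly_call_minutes = 20      # length of the batched mail-hour call
  rate_per_minute      = 0.15    # hypothetical rural intra-state rate, $
  days_per_month       = 30
  dedicated_line_cost  = 1000.0  # hypothetical monthly leased-line cost

  batched = nightly_call_minutes * rate_per_minute * days_per_month
  print("store-and-forward: $%.0f/month" % batched)   # -> $90/month
  print("dedicated line:    $%.0f/month" % dedicated_line_cost)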

	If the idea of universal electronic access is minimally to have an
Internet email account and access to a minimal set of services, then such
access is in fact already becoming available universally.  If it is to be
something more, then what?  If universal service suggests that we also
include a computer of some sort, then what kind?  The terminals that were
provided by the French government as part of their Minitel (Teletel)
nationwide videotext system did not have the capability to store or
manipulate data.  While some have suggested that we implement a policy of
tax credits for 'terminal' acquisition (Dordick & Lehman 1994), it is
unclear to me what a terminal is.  Is it the computer, the screen, the
modem?  Looking longer term, rather than a 'box that sits on the television
set and connects us to the Internet,' we should be seeing a kind of
terminal that establishes the home as a site on the Internet with its own
Internet address.  The issue of defining a terminal and its standards
returns us to the question raised earlier in the paper of determining the
extent to which the government is capable of properly restricting market
growth in one direction (new standards, more flexible interconnectivity) in
favor of developing other areas ('universal terminals').

                               Conclusion

	This has not been the place to discuss the ethics and implications
of this media revolution, particularly for the development of communities,
identity, and the construction of power, but rather to provide the
beginnings of a conceptual vocabulary with which to assess these changes.
For too long we have been bound to the focus on mass media or interpersonal
communications, as if these two 'poles' could somehow implicitly map out
the mediascape.  The ongoing transformations in the mediascape, and more
generally in the global/local cultural flowscape, must disconcert the
practitioners of the older schools of communication research.  The problem
is not that these methods do not do what they are supposed to do, but
rather that they leave unremarked something which, if we could say it,
would be remarkable indeed.  The use of geodesic concepts should be
considered as only one element in a new exploration of multiply mediated
interpersonal interactions, interactions which seem to change and reinvent
themselves so fast as to embark on new ordering principles not yet worked
out by first-order network theory.  This is the domain of mid-range
theories.

	At the center of these considerations is the community, and the
kinds of identities, both of things and of peoples, that communities both
create and reflect.  How are we to think, for example, of the integration
between communities, or of worlds where we might know many people, but not
our 'neighbors'?  This integration is becoming not simply social, but
technological.  Interface serves to connect and to displace, foregrounding
some decisions and assuming the answers to others.  This mid-range is a
region of not-quite-this, not-quite-that, a region somewhere between
integration and fragmentation.  One of the key things that communities do,
other than reproduce themselves and their sub-domains, is to make sense of
complexity: to create flexible, playful ways of dealing with a constantly
changing environment, and to break its changes down into usable size.  At
the same time, communities and their individuals exist in interaction, at
whatever scale we are exploring, and change within a community changes the
way that community deals with its environment as a whole, even if that part
is 'unaware' of the consequences.  This is true whether the community is a
human body or a body of humans.

	In developing community networks, we must bear in mind the
increasingly geodesic nature of the information infrastructure, and the
interdependence of the issues of bypass, information provision (especially
of public information), a political economy of skills, interface, and the
tradeoffs of transparency against cost.  A key issue in developing
community networks has not been how to set up a few nodes here or there, or
how to provide discount courses in computer literacy, but how to promote
access to truly useful information and to facilitate interconnection.  The
information that needs to be retrieved does not exist in any one place; it
is located in bits and pieces everywhere.  The flattening of hierarchies
means that individuals are demanding more direct access to information, and
ways to turn that information into knowledge and wisdom.  To transmute data
into information, users need to be able to continually contextualize and
recontextualize.  Whereas our books in the past have tried to anticipate
that context, to anticipate the kinds of questions that might be asked, the
more dynamic information retrieval and exchange systems allow users to
pursue further what they do not know.

	To truly understand this in the context of changing means of
communication, we need to look not just to the people involved in the
communication, or to the institutions of access, but to a new relationship
between them.  It is perhaps too easy to think of this new world and media
of cyberspace in terms of the older, more industrially suited topologies of
hierarchies and centers.  These hierarchies and centers continue to exist
in the world of material movement, in the world of the deployment of
limited resources including capital.  And yet the ongoing media revolution
involving the technologies of organization is remarkably undertheorized,
and some of this might be traced to an oversight of issues of geodesics.
Since identities relate us to our environment and to our communities,
technologies that change the access to and nature of these communities will
profoundly affect those basic identities.  We neglect to consider how these
technologies are changing the identity of the corporations and individuals
who are becoming accustomed to using them.  Something is happening, indeed,
to the transnational corporations as they seek to decentralize themselves
and become more fluid so as to keep up with the rapidly changing
environment.  To understand what it is, we need to look deeply into the
changing, geodesic topology of the information networks.  And this is
something that planners of community networks need to do as well.

			  Bibliography

Baldwin, Jay. Personal Communication. April 18, 1991. 

Bey, Hakim.  1991.  TAZ: Temporary Autonomous Zone, Ontological Anarchy,
Poetic Terrorism.  Brooklyn: Autonomedia

Bishop, Anne.  To appear. "Preliminary Study of Blacksburg Electronic
Village."

Bush, Randy. 1993. "Fidonet: Technology, Tools, and History." Communications
of the ACM. August 1993.

Carey, James. 1989.  "Technology and Ideology: The Case of the Telegraph."
in Communication as Culture. Winchester, Ma: Unwin Hyman

Chee-Kai, Chin.  1992. "Why Are Resources Free On The Internet?" Paper
distributed through UseNet Newsgroup:news.misc. 22 Dec. 1992 13:45:14 GMT

Cisler, Steve. 1993. "Community Computer Networks: Building Electronic
Greenbelts."  Communications of the ACM.

Community Information Exchange. 1993b. "Cix Information Exchange Systems"
available through NSFNet Gopher.

Cook, Gordon.  1992. "NSF audit of MERIT fails to examine MERIT/ANS
relationship."  Cook Report on the Internet. (May 13, 1993)

Cook, Gordon.  1993. "NSFnet "Privatization" and the Public Interest:  Can
Misguided Policy Be Corrected?  An Historical Review of Network Policy
Making - 1989 to the Present With Pointers for Corrective Action." Cook
Report on the Internet. Vol. 1: 10 & 11. (Dec. 31, 1992/Jan 1, 1993)

Cronin, Mary J. 1994. Doing Business on the Internet: How the Electronic
Highway is Transforming American Companies. NY: Van Nostrand Reinhold.

Deutsch, Peter.  1993.  "Peter's Soapbox."  Internet World.
November/December 1993.

Dodd, Carol Anne. 1992. "What's a FidoNet?: The Growth and Development of
an Amateur Computer Network."  Carleton University Working Papers in Public
Access Networks. (March 1992)

Edguer, Aydin. 1993. "alt.bbs.internet Frequently asked Questions [FAQ]
(with answers)." (March 28, 1993). Distributed via Internet.

Electronic Frontier Foundation. 1993. Press Release: "Gore Endorses EFF'S
Open Platform Approach."  December 21.

Fast, Adam. 1994. "Telecommunications Competition Act of Washington State"
Posted on Communet, January 14, 1994, 2:39 Central Standard Time.

Featherstone, Mike. 1990.  Global culture: nationalism, globalization, and
modernity.  Theory, culture & society special issue. London: Sage.

Frey, Donnalyn and Rick Adams. 1991. !%@:: A Directory of Electronic Mail
Addressing & Networks. Sebastopol, CA: O'Reilly & Associates. (2nd, rev.
ed.).

Fuller, R. Buckminster. 1974. Synergetics I: Explorations in the Geometry
of Thinking. NY: Macmillan Publishing.

Fuller, R. Buckminster. 1979. Synergetics II: Further Exploration in the
Geometry of Thinking. NY: Macmillan Publishing.

Fuller, R. Buckminster. 1981. Critical Path.  NY: St. Martins

Gilder, George. In Press. Telecosm. NY: Simon & Schuster.

Grundner, Thomas. 1993. "Toward The Formation Of a 'Corporation For Public
Cybercasting'"  Available on-line via the Cleveland Freenet.

Hadden, Susan.  1993. Austin Internet Users Group Meeting. November.

Hart, Jeffrey A., Robert R. Reed, and Francois Bar.  1992.  "The building
of the Internet: implications for the future of broadband networks".
Telecommunications Policy:666-689. (November)

Huber, Peter W. 1987.  The geodesic network: 1987 report on competition in
the telephone industry. Washington, D.C: Government Printing Office.
Revised edition 1991.

Huber, Peter. 1994. "The Enduring Myth of the Local Bottleneck."
Washington, DC: Commissioned by Regional Bell Operating Companies. Released
March 14.

Hughes, Dave. 1987. "The Electronic Democracy Debate." Meta-Net BBS, Old
Salon, Topics 121, 153, 288, 372; New Salon 3.  Also, Chariot BBS, Denver,
CO. 1-719-632-3391.

Hughes, David S. and George L. Johnston.  1993.  "The Other Half: The
Training Cost of a National Telecommunications Infrastructure"  Paper
presented at the conference "Public Access to the Internet," JFK School of
Government, May 26-27, 1993.

Kleinrock, L. (1992). Technology issues in the design of NREN.  In Kahin,
B. (Ed.),  Building Information Infrastructure. New York: McGraw-Hill.

Locke, Christopher. 1993. "RFC/FYI - Editorial" in The Internet Business
Journal Commercial Opportunities in the Networking Age. Available on-line.

MacKie-Mason, Jeffrey K. and Hal  R. Varian. 1993. "Pricing the Internet."
Draft of paper presented at the conference "Public Access to the Internet,"
JFK School of Government, May 26-27, 1993.

Markoff, John. 1993. "Traffic Jams Already on the Information Highway." New
York Times. A1.  Nov. 3.

Morningstar, Chip. 1994. "Re: Telecommunication Competition Act of
Washington State." Posted on Communet, Jan. 14, 1994.

Mosco, Vincent.  1982.  Pushbutton Fantasies: Critical Perspectives on
Videotex and Information Technology. Norwood NJ: Ablex.

Mosco, Vincent and Janet Wasko, eds. 1991. The Political Economy of
Information.  Madison: University of Wisconsin Press.

National Coordination Office for HPCC. 1993. HPCC FY94 "Blue Book" entitled
"High Performance Computing and Communications: Toward a National
Information Infrastructure."  The National Coordination Office for High
Performance Computing and Communications, 8600 Rockville Pike, Bldg
38A/B1N30, Bethesda, MD 20894.

National Public Telecomputing Network. 1992. "National Public Telecomputing
Network Academy One Program List Of Offerings For School Year 1992-1993."
Available on-line via the Cleveland FreeNet.

Neuman, W. Russell. 1991.  The future of the mass audience. Cambridge:
Cambridge University Press.

Nelson, Ted. 1974. Dream Machines.  (Publisher not stated)

Odasz, Frank. 1993. "Community Economic Development Networks: A
Teleliteracy Primer"  Paper presented at the conference "Public Access to
the Internet," JFK School of Government, May 26-27, 1993.  A more extended
version was also made available to me by the author and was available as "A
Costs Model for Internet Access" distributed on-line through the Consortium
for School Networking Mail List.

Pavlik, John V. and Mark A. Thalhimer. 1994.  "Sizing up Prospects for a
National Information Service." (In Williams 1994.)

Quarterman, John S.  1990. The matrix : computer networks and conferencing
systems worldwide. Bedford, Mass.: Digital Press.

Quarterman, John S. 1991. "Networks in Argentina."  Matrix News. Vol. 1:8
(Nov. 1991).

Rheingold, Howard. 1993.  The Virtual Community: Homesteading on the
Electronic Frontier. NY: Addison-Wesley.

Rickard, Jack. 1993. Letters.  Boardwatch Magazine. Littleton, CO. [Rickard
is the editor of Boardwatch]

Sakolsky, Ron and James Koehnline. 1993. Gone to Croatan: Origins of North
American Dropout Culture. Brooklyn: Autonomedia.

Schickele, Sandra. 1993. "The Economic Case for Public Subsidy of the
Internet."  Paper presented at the conference "Public Access to the
Internet," JFK School of Government, May 26-27, 1993.

Smith, Ben and Jon Udell. 1993. "Linking LANs."  Byte. December 1993.

Sterling, Bruce. 1993. "Pentagon Brainchild, Anarchists' Dream" Magazine of
Fantasy and Science Fiction. Reprinted on-line and in the Public Media
Monitor. Austin, Tx. Fall 1993.

Tibbets, John and Barbara Bernstein. 1993 "Political Primer for Enterprise
Networks."  Byte. December 1993.

Uncapher, Willard. 1991. "Trouble in Cyberspace: Civil Liberties in the
Information Age."  Humanist.

Uncapher, Willard. 1993. "Privatizing the Global Telecommunications
Infrastructure: 5 Examples."  Available via FTP from
Actlab.rtf.utexas.edu:/export/home/friends/paradox/pub/privatize.

Uncapher, Willard. 1994. "Between Local and Global: Placing the Mediascape
in the Transnational Cultural Flow."  Available via FTP and Gopher from
Actlab.rtf.utexas.edu near to the Gopher main menu.

Van Tassel, Joan. 1994. "Yackety-Yak, Do Talk Back."  Wired 2.01. (January)

Wallerstein, Immanuel.  1987. "World Systems Analysis." in A. Giddens and
J. Turner, eds. Social Theory Today. Stanford: Stanford University Press.

Wallerstein, Immanuel.  1990. "Culture as the Ideological Battleground of
the Modern World-System."  in Featherstone.

Williams, Frederick and John V. Pavlik.  1994. The Citizen's Right to Know:
Media, Democracy and Electronic Information Services.  Hillsdale, NJ:
Lawrence Erlbaum Associates, Publishers.

Yanoff, Scott. 1993. "Inter-Network Mail Guide." Available by anonymous ftp
from csd4.csd.uwm.edu, and distributed widely on the Internet.
