The Mad Schemes of Dr. Tectonic
Network dynamics [Feb. 10th, 2006|08:08 pm]
Beemer
Saw a very cool talk this morning about dynamics of and on networks.

A large fraction of the talk was an introduction to network theory. The general point was to introduce network dynamics as an alternative approach to the study of middle-order complex systems.

Small systems (with only a few interacting pieces), see, you can just do the math for. And really large systems (like, 10^23 interacting molecules) can be handled with statistical mechanics. But systems with too many components to do the math, but not enough to apply stat-mech to, are really, really hard.

However, if you can represent your system as a bunch of nodes connected together in a network and interacting via the links, you can do some cool stuff, because the topology of the network imposes strong constraints on the system behavior regardless of what's actually passing across the links!

He talked about a whole bunch of stuff, but here's one of the neat ideas: network structure determines how big your system can get before it goes unstable. One of the features that large, stable networks have is that they are disassortative -- the high-degree nodes (the hubs) will not be adjacent to each other, they will be separated by low-degree nodes.
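A quick way to check this property on an actual graph is the degree-assortativity coefficient: the Pearson correlation between the degrees at the two ends of every edge. Negative means disassortative (hubs avoid each other), positive means assortative. A minimal pure-Python sketch, using a made-up star graph as the example:

```python
def assortativity(edges):
    """Pearson correlation between the degrees at the two ends of each edge.
    Negative => disassortative (hubs avoid hubs); positive => assortative."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    # collect degree pairs in both directions so the measure is symmetric
    xs, ys = [], []
    for u, v in edges:
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# a star: one hub linked to four leaves -- maximally disassortative,
# since every edge joins a high-degree node to a low-degree one
star = [(0, 1), (0, 2), (0, 3), (0, 4)]
print(assortativity(star))  # close to -1.0
```

(The star graph here is just an illustration, not anything from the talk.)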

Now, a system that's optimized for information transfer, like a social network, needs to be assortative, with the hubs close to one another.

So there's a possible explanation for the "natural size limit" on human communities of about 100 people. It's not an arbitrary limit on how many people a human brain can keep track of; it's that the human brain has never needed to evolve the capacity to keep track of communities larger than that, because they become unstable above that size...

(Note: I'm not saying that you can't keep a system stable above a certain size, because that's obviously not true. I think it's more like once a system has gotten as big as it can get and still be naturally stable, you have to put another system on top of it in order for things to be stable at the new scale. Informal decision-making only works for small companies; once your company gets to a certain size, you have to create formal structures to keep functioning.)

Anyway, there was lots of other neat stuff (like network instability being a possible contributor to the collapse of the Maya), but the assortative limit idea was the thing I had to share.

Comments:
From: nosato
2006-02-10 08:46 pm (UTC)
Interesting. We had a job candidate's talk yesterday, and he talked about just this kind of network stuff. I bet you would have enjoyed his talk as well. I copied the abstract below. You can google his name and you will find his web page at the top of the list (if you are interested).


Tony Grubesic, Assistant Professor, Geography, University of Cincinnati

Spatial Optimization for Identifying Vulnerability in Critical Network Infrastructure

With an increased level of interconnection between critical network infrastructures, the potential for extreme events to generate both primary and secondary network failures is of great concern. For example, the attacks on the World Trade Center, the blackout of August 14, 2003 and the Galaxy IV satellite failure have reinforced the need for a more thorough understanding of vulnerabilities relative to system performance. This presentation will explore the topological complexities associated with network interconnections and highlight the potential impacts of losing critical infrastructure elements that are geographically linked. Specifically, the loss of vital nodes, like telecommunication switching centers, is evaluated utilizing a developed spatial optimization model. Results indicate that certain infrastructure topologies are not particularly well equipped to handle the loss of critical network elements, while others could withstand the loss of vital nodes and maintain an adequate level of service.
From: backrubbear
2006-02-11 08:33 am (UTC)
Sounds like the stuff of medium to large sized ISPs. :-)

Most networks of a large size tend to have levels of inherent abstraction. What drives those levels is interesting in its own right, but not for the purposes of this discussion. A given set of nodes and links can effectively be aggregated into a relatively small cluster of nodes and edges. This leads to a "six-degrees" type effect where nodes (the backbone in the net, important people in social networks) can often reach each other through someone that is more heavily connected.

But much like other things, those heavily connected people or nodes require less information about the topology of those below them. They often have access to a large amount of summary information, but seldom the full topology. This is akin to the area (or area/level) concept in two of the more common routing protocols that use Dijkstra's algorithm.
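For readers who haven't met it, the shortest-path computation those routing protocols run at their core looks roughly like this. The four-node topology below is a made-up example, not any real network:

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source; graph maps node -> {neighbor: cost}."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# hypothetical backbone: A and B are hubs, C and D hang off them
net = {"A": {"B": 1, "C": 4}, "B": {"A": 1, "D": 2},
       "C": {"A": 4}, "D": {"B": 2}}
print(dijkstra(net, "A"))  # {'A': 0, 'B': 1, 'C': 4, 'D': 3}
```

Note how D is reached through hub B rather than directly: traffic between stubs funnels through the heavily connected nodes, which is exactly the summarization the comment describes.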
From: jofish22
2006-02-11 12:05 pm (UTC)
Who da speaker? Sounds interesting. Most of the work I've seen on the "natural size limit" for communities is Robin Dunbar's -- he's got a very good book on it with the word Gossip somewhere in the title, I forget exactly. But the idea that the network structure itself becomes unstable above that size is interesting. Very interesting, and I'm not sure it's totally true: in particular, I think you can get around it with multiple intersecting sets of size <=100. Which I think means I'm agreeing with you above.

I went to a talk by Jon Kleinberg the other day on the economics of networked structures (mainly trees, for the sake of argument) for [paid] information finding, with middlemen serving as brokers. What's perhaps most interesting about the talk was that it basically works if trust is high and people don't rip off the system -- ie, if I offer you and all my 1st-degree contacts $10 to find an answer to my question, and you then offer it to your 1st-degree contacts for $9 (so you get a $1 finder's fee), it works, because the search gets pretty deep with big spread pretty quickly. If there's low trust and a high rip-off factor -- ie, I offer $10 and you offer $5 -- it bottoms out below the utility threshold for people to bother replying (call it $1) too quickly to be useful. (Note that for a reasonably well-spread-out graph, the last step, whatever it is, probably contains as many new contacts as the whole of the tree to date.) Nifty stuff.
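The arithmetic of that cascade is easy to sketch in a toy model. The $10 offer, the per-hop fee, and the $1 reply-utility floor are the numbers from the comment; modeling the low-trust case as a flat $5 skim per hop is my own crude simplification:

```python
def cascade_depth(offer, fee, floor):
    """How many hops the request propagates if each middleman skims `fee`
    and nobody bothers to forward once the remaining offer falls below `floor`."""
    depth = 0
    while offer - fee >= floor:
        offer -= fee
        depth += 1
    return depth

def reach(branching, depth):
    """Total contacts reached by a uniform tree with the given branching factor."""
    return sum(branching ** d for d in range(1, depth + 1))

# high trust: $1 finder's fee per hop, $1 reply-utility floor
print(cascade_depth(10, 1, 1))  # 9 hops
# low trust: a flat $5 skim per hop kills the cascade almost immediately
print(cascade_depth(10, 5, 1))  # 1 hop
# with, say, 10 contacts per person, 9 hops reach over a billion people
print(reach(10, 9))
```

It also shows the comment's closing observation: the last level of `reach` (`10**9`) is bigger than all the earlier levels combined.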
From: detailbear
2006-02-11 09:34 pm (UTC)
In leadership school, we learned that a congregation transitions from a small church to a medium-sized church at about 75-100 people, and has to begin formalizing structures. At about 150-200 people, the structure doubles into a policy group and a working group with a formal bridge between them. At about 400, it becomes a group of groups — a two-tier network.

Just another parallel.