Complexity



[...
> There are several ways of defining biological complexity.
>
> Common metrics involve things like counting the number of
> different cell types an organism produces - and estimating
> the "kolmogorov complexity" of its genome.
>
>
> Though the definitions may differ in detail, they tend to
> be correlated - and in discussions like this it tends not
> to matter very much which one you use.
> --
> __________
> |im |yler http://timtyler.org/ [email protected] Remove
> lock to reply.
>

Which is exactly why threads such as this quickly devolve
into philosophical debates.

So for example (and in loose reference to one of your other
posts), there are many, many groups of bacteria that
cooperate, differentiate into a multitude of
morphologically distinct cell types, and build elaborate,
macroscopic (differentiated though monoclonal) structures.
[See, for examples, hyphae-producing Streptomyces, akinete
and/or heterocyst forming cyanobacteria, iron-scavenging
Shewanella macrocolonies, or the wondrous assortment of
organisms that comprise e.g. thermophilic, alkaline
microbial mats]. Yet the Kolmogorov complexities of the
genomes of these organisms are indistinguishable; they
all have roughly the same number of genes; the degree
distributions of their metabolic and protein-interaction
networks are all scale-free (power-law distributed) with
nearly identical scaling exponents; etc. etc. ad nauseam. Find me some
underlying, unifying Standard Model of Complexity that has
evaded notice in all studies to date, and I'll pay all the
publication charges on your paper.
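
To make "nearly identical scaling exponents" concrete: the comparison
usually boils down to fitting a power-law exponent to each network's
degree sequence. A toy sketch in Python - the degree sequences below
are made-up stand-ins, not real metabolic-network data, and the
estimator is just the textbook maximum-likelihood formula:

import math
import random

def powerlaw_alpha(degrees, k_min=2):
    # Standard MLE for a power-law tail, with the usual k_min - 0.5
    # correction: alpha = 1 + n / sum(ln(k / (k_min - 0.5))).
    ks = [k for k in degrees if k >= k_min]
    return 1.0 + len(ks) / sum(math.log(k / (k_min - 0.5)) for k in ks)

random.seed(0)
# Two fake degree sequences standing in for two organisms' networks.
net_a = [int(random.paretovariate(1.3)) + 1 for _ in range(2000)]
net_b = [int(random.paretovariate(1.3)) + 1 for _ in range(2000)]

print(powerlaw_alpha(net_a), powerlaw_alpha(net_b))

Run it and the two estimates land within a few percent of each other -
exactly the sort of "indistinguishable" answer I'm complaining about.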

To put it simply, there is no metric that has ever been
proposed, whether it be here, in the work of Gould, Dawkins,
Shannon, Kauffman, etc., or in the scientific literature,
that is able to correlate some /generalizable complexity
gradient/ with the genomes, genotypes, or known evolutionary
trajectory of any prokaryotes. Given any one of the measures
that we might collectively come up with to define
complexity, one ends up with entirely different arrangements
of 'simple->complex' organisms.

> Though the definitions may differ in detail, they tend to
> be correlated - and in discussions like this it tends not
> to matter very much which one you use.

To put it another way, it *only* matters which one you use.

One of the few promising new infusions into this somewhat
dismaying debate, IMO, comes from Wolfram's Principle of
Computational Equivalence -- we have so much trouble
categorizing complexity because there are only two
categories to be found; things are either complex or they
are simple. (Not that I believe this just yet, but it's a
much needed new angle).
 
r norman <rsn_@_comcast.net> wrote in message news:<[email protected]>...
> On Thu, 15 Apr 2004 03:57:36 +0000 (UTC),
> [email protected] (chupacabra) wrote:
>
> >The question that perplexes me - why does evolution
> >progress from the simple to the complex? The simple
> >bacteria and other "primitive" forms of life are by no
> >means less "viable" than more complex forms -- animals
> >and humans. Many of these "primitive" species remain the
> >same for hundreds of millions of years, survive
> >perfectly in their environments and don't need to evolve
> >into complex forms. Indeed, complex forms are often
> >more fragile and susceptible to environmental
> >perturbations than primitive ones. So how can natural
> >selection alone explain the general vector of
> >evolution - from simple and primitive to more complex
> >forms? Or could there exist some other force apart from
> >natural selection -- to "push" evolution in the
> >direction of complexity, a developed nervous system, self-
> >awareness etc.???
>
> The general idea now is that there is no "progression" of
> evolution towards more and more complex forms. It is
> necessarily true that the original life forms were
> relatively simple. It is also true that we are rather
> complex. So if you look at evolution from the original
> form to us, it does seem like an increase in complexity.
> However, most living things are microorganisms and if you
> look at evolution from the original to a modern bacterium,
> you get a different impression.
>
> Another way to look at it is as a random walk process.
> Evolution tends to spread out organisms in all directions.
> However, it started with simple things and there is a
> lower bound to how simple an organism can be and still be
> alive. So there is necessarily an increase in average
> complexity with time. Still, most things remain simple.

A vivid way of making this same point is to ask for an
explanation of the "remarkable" southward vector of human
migration in the first few millennia after the Bering land
bridge was crossed. What conceivable force drew these early
migrants inexorably to Tierra del Fuego?

The question seems even more compelling if it is being
investigated by an ethnocentric Fuegian scientist who is
inclined to doubt that the larger, richer populations of the
Valley of Mexico and the Andes are very interesting. (That is,
if you have missed my point, perhaps an objective observer
might see Nature's "progress vector" as leading to the
flowering plants and the insects, rather than to H. sap.)
 
"Tim Tyler" <[email protected]> wrote in message
news:[email protected]...
> Frank Reichenbacher <[email protected]> wrote or
> quoted:
>
> > There is no *general vector* of evolution that
> > determines that complexity will be favored.
>
> Two effects seem to fit this bill - at least on a
> large scale:
>
> * The progressive accumulation of technology - e.g.
> photosynthesis, haemoglobin, etc;
>
> * Co-evolutionary arms races between large-brained
> organisms;
>
> The first one is the fundamental one. The second one does
> directly produce complexity - but it really relies on the
> first one to explain why it is ultimately favoured.
>
> The fact that nature is constantly learning new tricks -
> but rarely forgets old ones that were any use gives
> evolution an unmistakable progressive character - a
> direction, if you will.
>
> Some would go even further - and say that evolution has a
> "goal" - or at least is behaving as though it has one - by
> consistently heading in a specific direction.
> --
> __________
> |im |yler http://timtyler.org/ [email protected] Remove
> lock to reply.
>

Why did you snip the part where I essentially agreed with
you?

"There is simply the drive to exploit available habitat in
order to avoid competition from existing and diversified
forms. Sometimes this will actually require that the organism
become simpler, as in the loss of limb structures in
the evolution of marine mammals from terrestrial mammals. It
does seem like the more successful pathway for the
organism, however, is the development of new functions."

I do not believe that my statement, "There is no *general
vector* of evolution that determines that complexity will be
favored," contradicts this. I think that the argument over
the question of progression in evolution is misguided and
beside the point. Natural selection will favor whatever
gives the organism an advantage. A new structure or function
might do this radically better than attempting to compete
against a diverse array of competitors which have already
attuned existing structures and functions to the competitive
environment.

The human property of syntactical language is an excellent
example. It is arguably the single most important feature of
mankind that resulted in the current explosion of
population. No other organism possesses such a faculty.

Does this mean we were participants in some general drive
toward greater complexity? Or does it mean that the
adaptation was successful only because no other organism
possesses the capacity for syntactical language? Suppose a
multitude of other organisms not in the primate->human clade
possessed syntactical language skills? Would there now be 6+
billion humans? Would our average lifespans have more than
doubled in the past century?

Now, how will another sentient being compete with us? By
evolving better language skills? Or by evolving some
completely new structure or function?

Frank
 
r norman <rsn_@_comcast.net> wrote or quoted:

> A number of people have commented on my statement about
> evolution as a random walk with no specific tendency to
> evolve towards more complexity. Here is a better
> description of what I was trying to explain.
>
> Imagine an abstract "phenotype landscape" spread out, each
> point representing one possible type of organism. Imagine
> it organized by "complexity", something we can't really
> define or measure but we know it when we see it. On one
> side are the simple things, on the far end are the most
> complex. There is a wall on the simple end -- too simple
> and you can't sustain life. We don't know (or haven't
> reached) a wall on the complex end. In the beginning, you
> start with a bunch of cells all bunched along the wall at
> the simple end. Evolution is a random walk. There is no
> specific tendency to get more complex nor is there any
> specific tendency to get less complex. There is only a
> tendency to change, to move from where you are to another
> location. The changes are random in direction; a
> "drunkard's walk". Over time, organisms tend to fill the
> landscape, spreading out over everything. As time goes on,
> organisms spread farther and farther into the complex
> region. The leading edge always gets more and more
> complex. The average always gets more and more complex.
>
> Still, the mechanism of evolution in no way demonstrates a
> tendency to produce complexity. The mechanism of evolution
> is to produce and select for change. The "move towards
> complexity" is simply a product of the diffusion process
> (the random walk) and the boundary conditions (a barrier
> at the simple end) and the initial condition (start
> concentrated at the simple end).

What follows is *all* IMHO:

It isn't the mechanism of evolution that produces the
tendency towards increased complexity. It's the nature of
the environment.

There has never been any coherent evidence presented
suggesting that any metric of complexity follows a
random walk.

Gould himself didn't seem to think they did in any
particular species. He presented an argument suggesting that
a randomly-chosen lineage was more likely to become more
simple than complex.

This is almost certainly true. Indeed a "randomly-chosen"
lineage is rather likely to become maximally simple -
i.e. extinct.

Gould's speculation suggested the complexity of something
should follow a SQRT(t) curve characteristic of a random
walk bounded by a wall.

Thus, Gould predicted that the increase in complexity is
slowing down.
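
To see where the SQRT(t) comes from, here's a toy simulation of a
wall-bounded random walk - not Gould's actual model, just an
illustration in Python with arbitrary numbers:

import random

random.seed(1)

def leading_edge(steps, lineages=500, wall=1):
    # Each lineage takes unbiased +/-1 "complexity" steps and is
    # reflected at a lower wall (minimum viable complexity).
    # Returns the most complex lineage at a few sample times.
    pos = [wall] * lineages
    samples = {}
    for t in range(1, steps + 1):
        for i in range(lineages):
            pos[i] = max(wall, pos[i] + random.choice((-1, 1)))
        if t in (100, 1000, 10000):
            samples[t] = max(pos)
    return samples

print(leading_edge(10000))

Multiplying the time by ten only multiplies the leading edge by
roughly three - SQRT(t) growth - which is why Gould's picture
implies a slowdown.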

I'll happily put my name to a completely opposed hypothesis
- "the complexity of life" (with practically whatever
reasonable complexity metric you like) is increasing faster
and faster.

Gould's "modal bacter" is also rather misleading. This isn't
the age of bacteria any more. Bacteria failed to club
together - and consequently got displaced from many
important environments long ago.

Look out of your window and it is the macroscopic
organisms which you will see the most of - colonial
organisms, not loners.

In the current mass extinction, certain large complex
organisms appear to be thriving - while many species of
bacteria are being wiped out (along with organisms in most
other niches).

This is likely to continue - and the eclipse of bacteria
will gather strength as time passes.

More and more of the available biomass will come to be
concentrated in highly-complex organisms. Many of the roles
played by bacteria will come to be played by machines and
processing plants. We won't get rid of pathogens and
parasites completely - but (as today) they will only rarely
get the upper hand.
--
__________
|im |yler http://timtyler.org/ [email protected] Remove
lock to reply.
 
<< Find me some underlying, unifying Standard Model of
Complexity that has evaded notice in all studies to date,
and I'll pay all the publication charges on your paper. >>

Heat - energy moderation (mostly at the high and most
damaging end of temperature). Any chemical system that
doesn't survive the heat cycle doesn't exist. What does
= life chemistry.

Life= energy moderation with modification through
descent. (It's obviously heat or matter - and heat's the
better choice)

Life didn't pop and then adapt; it was a response to a heat
cycle (the other factors - gravity, pressure, pH, etc. -
don't come near the effect of heat).

The metabolism side of life is obviously a reaction to heat
- that's what metabolism is. (First replication was probably
just a novel way to survive heat another day)

No life outside the bounds of liquid water 0-100C - thus to
ensure survival all life must stay within temp bounds. And
the closer it gets to its optimum temp the better its
enzymes can perform - which are there to enhance chemistry
at that temp in the first place.

Life is a non-random response to a non-random heat cycle in
the environment.
 
in article [email protected], phillip smith at
[email protected] wrote on 17/4/04 3:49 AM:

> in article [email protected], chupacabra
> at [email protected] wrote on 15/4/04 3:57 PM:
>
>> The question that perplexes me - why does evolution
>> progress from the simple to the complex? The simple
>> bacteria and other "primitive" forms of life are by no
>> means less "viable" than more complex forms -- animals
>> and humans. Many of these "primitive" species remain the
>> same for hundreds of millions of years, survive
>> perfectly in their environments and don't need to evolve
>> into complex forms. Indeed, complex forms are often
>> more fragile and susceptible to environmental
>> perturbations than primitive ones. So how can natural
>> selection alone explain the general vector of
>> evolution - from simple and primitive to more complex
>> forms? Or could there exist some other force apart from
>> natural selection -- to "push" evolution in the
>> direction of complexity, a developed nervous system, self-
>> awareness etc.??
>
>
> My theory is explained here
>
> ed-evolution.co.nz/selfishH/selfish_helper.ssi
>
> I have since I wrote this paper added some more aspects
> but they have not been published either in journals or to
> the web as yet

Oops url was wrong

Should be www.applied-
evolution.co.nz/selfishH/selfish_helper.ssi

Basically, parasitic complexity is an adaptive response to
genetic load.

It is driven by natural selection but may be considered an
artifact of all evolutionary systems. It is not confined
to biological evolution; we see this phenomenon in our
everyday life.

--

Phillip Smith phills@(buggger).co.nz replace bugger with
ihug http://www.applied-evolution.co.nz

"he who is smeared with blubber has the kindest heart" -- a
Greenland Eskimo adage
 
[...
> The bacteria are disadvantaged - since they can't easily
> cooperate with one another, and build large structures -
> and such cooperation seems to pay off.
>
> They have been (literally) overshadowed - and relegated
> to the nooks and crannies of the world. These days much
> of the work gets done by macroscopic organisms - such
> as trees.
>
> Once bacteria ruled the world - but now they are in the middle
> of a period of decline. Their decline seems likely to
> continue - as much of the world's chemical processing gets
> taken over by machines - who will have stolen the
> bacteria's enzymatic secrets from their genomes.
> --
> __________
> |im |yler http://timtyler.org/ [email protected] Remove
> lock to reply.
>

This anthropocentric misinformation that you've slipped in
at multiple points in this thread has no factual basis and
no support in real science. There have been at least half a
dozen articles in just the past few years in Science and
Nature alone that point towards an inverse scale free
relationship between organism size and organism density.
That is, the very smallest organisms outnumber slightly
larger organisms by orders of magnitude, and outnumber even
larger organisms by many more orders of magnitude. In fact
the largest organisms on Earth (ourselves among them) make
up only a minute fraction of the global biomass.

Wipe out every multicellular organism on Earth and single-
celled life will barely take notice. Take out the single-
celled microbial contingent and every ecosystem on
Earth will collapse in short order. We are pawns in a
microbial world.

[See e.g. Marquet, Invariants, Scaling Laws, and Ecological
Complexity. Science 289, 1487-8, for a good commentary and
chain-of-reference jumpstart.]
 
IRR <[email protected]> wrote or quoted:

> So for example (and in loose reference to one of your
> other posts), there are many, many groups of bacteria that
> cooperate, differentiate into a multitude of
> morphologically distinct cell types, and build elaborate,
> macroscopic (differentiated though monoclonal) structures.
> [See, for examples, hyphae-producing Streptomyces, akinete
> and/or heterocyst forming cyanobacteria, iron-scavenging
> Shewanella macrocolonies, or the wondrous assortment of
> organisms that comprise e.g. thermophilic, alkaline
> microbial mats]. Yet the Kolmogorov complexities of the
> genomes of these organisms are indistinguishable; they
> all have roughly the same number of genes; the degree
> distributions of their metabolic and protein-interaction
> networks are all scale-free (power-law distributed) with
> nearly identical scaling exponents; etc. etc. ad nauseam. Find me some
> underlying, unifying Standard Model of Complexity that has
> evaded notice in all studies to date, and I'll pay all the
> publication charges on your paper.

The best-established way of measuring complexity is to use
Kolmogorov complexity.

This:

* Confines discussion to digital phenomena;
* Is difficult to measure;
* Has a subjective element - since it depends on a choice of
descriptive language.

The first is no problem, if we are content to confine
ourselves to the complexity of genomes.

The second is a problem in theory:

* Except in a few trivial cases, you can only put bounds on
the metric - rather than measure it exactly (and even then
the lower bound is rarely much use). I would suggest
ignoring this problem - and measuring the value using a
conventional high-quality compressor of a type that is
capable of dealing well with repeated sequences.

...and in practice...

* You need to sequence the genome in question before you can
measure its complexity;

The third makes the metric less aesthetically
attractive. My approach would probably be to say
something along the lines of:

"Always use FORTRAN-77 as your language".

> To put it simply, there is no metric that has ever been
> proposed, whether it be here, in the work of Gould,
> Dawkins, Shannon, Kauffman, etc., or in the scientific
> literature, that is able to correlate some /generalizable
> complexity gradient/ with the genomes, genotypes, or known
> evolutionary trajectory of any prokaryotes. Given any one
> of the measures that we might collectively come up with to
> define complexity, one ends up with entirely different
> arrangements of 'simple->complex' organisms.

I would characterise the effect as "broad agreement" -
rather than "broad disagreement".

> > Though the definitions may differ in detail, they tend
> > to be correlated - and in discussions like this it tends
> > not to matter very much which one you use.
>
> To put it another way, it *only* matters which one
> you use.

In /principle/, you can generate Kolmogorov complexity
metrics that give whatever answers you like for the
complexities of different organisms - by varying the
descriptive language used to measure it.

...but in practice, complexity measures broadly agree - IMO.
If they didn't broadly correspond to what people think of as
complexity, they would not be accepted as metrics of
complexity in the first place.
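
A quick way to see the "broad agreement" is to hand a few sequences
of varying repetitiveness to several off-the-shelf compressors and
compare the orderings they produce. A toy sketch - the sequences are
made up, and the compressors stand in for different "descriptive
languages":

import bz2
import lzma
import random
import zlib

def ranking(sizes):
    # Indices of the sequences, ordered from smallest compressed
    # size (simplest) to largest (most complex).
    return sorted(range(len(sizes)), key=lambda i: sizes[i])

random.seed(3)
seqs = []
for block_len in (20000, 10000, 5000, 2000, 500):
    block = "".join(random.choice("ACGT") for _ in range(block_len))
    seqs.append(block * (20000 // block_len))   # all 20,000 chars long

for name, compress in (("zlib", zlib.compress),
                       ("bz2", bz2.compress),
                       ("lzma", lzma.compress)):
    sizes = [len(compress(s.encode("ascii"))) for s in seqs]
    print(name, ranking(sizes))

All three compressors put the sequences in the same order - different
"languages", same verdict about which is simple and which is complex.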

> One of the only promising new infusions into this somewhat
> dismaying debate, IMO, comes from Wolfram's Principle of
> Computational Equivalence -- we have so much trouble
> categorizing complexity because there are only two
> categories to be found; things are either complex or they
> are simple.

That's not really what Wolfram is saying. What he's saying
is more like there are only two sorts of system capable of
performing computation - if you choose to completely ignore
some basic factors like speed, relative performance of
different operations, cost, efficiency, and memory capacity.

Complexity - as everyone agrees - is a scalar property, not
a binary one.
--
__________
|im |yler http://timtyler.org/ [email protected] Remove
lock to reply.
 
On Sat, 17 Apr 2004 04:29:51 +0000 (UTC), [email protected]
(Jim Menegay) wrote:

>r norman <rsn_@_comcast.net> wrote in message
>news:<[email protected]>...
>> [snip the original question and my reply, both quoted in full
>> above]
>
>A vivid way of making this same point is to ask for an
>explanation of the "remarkable" southward vector of human
>migration in the first few millennia after the Bering land
>bridge was crossed. What conceivable force drew these early
>migrants inexorably to Tierra del Fuego?
>
>The question seems even more compelling if it is being
>investigated by an ethnocentric Fuegian scientist who is
>inclined to doubt that the larger, richer populations of
>the Valley of Mexico and the Andes are very interesting.
>(That is, if you have missed my point, perhaps an
>objective observer might see Nature's "progress vector" as
>leading to the flowering plants and the insects, rather
>than to H. sap.)

I like it! A very nice example.
 
"Tim Tyler" <[email protected]> wrote in message
>
> Gould's argument was that *complexity* followed a
> random walk.
>
> This criticism doesn't apply to that claim.
>
> It /would/ apply to the claim that evolution - or genomes
> - followed a random walk - but Gould never made those
> assertions in the first place.
>
The problem is, what do we mean by complexity? We could take
size. This is highly significant for any animal's adaptive
strategy, and there are many cases of animals becoming
smaller as a result of evolutionary pressure as well as of
animals becoming bigger. If we ignore the local pressures on
size then we do have something that looks superficially like
a "random walk" starting from a single-celled base. What
about genome size? For multi-celled eukaryotes the energetic
cost of carrying extra DNA is trivial. However we don't
understand the function, if any, of the non-coding DNA. So
genome sizes may be taking a true random walk. However, what
about when we restrict ourselves to DNA that does something
(expressed genome size)?

Now any random change beyond a
small level will almost certainly be fatal to the organism
involved. Only an evolutionary ratchet with beneficial
changes providing the basis for more beneficial changes can
drive big alterations in expressed genome size. Now a
beneficial change could either simplify or complicate the
genome. However if it simplifies then it does so by undoing
a change which was previously beneficial. Sometimes this
will happen, but this is not a random walk, as with the body
size example, because simplification does not have a direct
adaptive significance, but only because of the effect on the
phenotype. Evolution seldom acts on a "few proteins"
organism directly, so we don't have the body size situation
with niches appearing for organisms of different sizes. Only
if we propose that simplifications and complexifications are
exactly and necessarily balanced in their adaptive advantage
would you have random walk characteristics.
 
On Sat, 17 Apr 2004 04:29:54 +0000 (UTC), Tim Tyler <[email protected]>
wrote:

<snip most of the discussion>

>Look out of your window and it is the macroscopic
>organisms which you will see the most of - colonial
>organisms, not loners.
>
>In the current mass extinction, certain large complex
>organisms appear to be thriving - while many species of
>bacteria are being wiped out (along with organisms in most
>other niches).
>
>This is likely to continue - and the eclipse of bacteria
>will gather strength as time passes.
>
>More and more of the available biomass will come to be
>concentrated in highly-complex organisms. Many of the roles
>played by bacteria will come to be played by machines and
>processing plants. We won't get rid of pathogens and
>parasites completely - but (as today) they will only rarely
>get the upper hand.

You conveniently snipped out the relevant portion of my post
which said:

The existence today of a huge variety of less complex
organisms (most genomic variability lies in the prokaryotes)
shows that there are enormous numbers of habitats and niches
where simple shows a high degree of fitness. On the other
hand, there are enormous numbers of niches where complex
shows higher fitness. We rather large terrestrial organisms
tend to focus on the latter, completely overlooking the
former. If we were microscopic aquatic organisms, we might
have a different perspective.

When you say "look out of your window" you display the
philosophical blinders I refer to. It is not at all clear
that bacteria are being wiped out more than macroorganisms.
It is not at all clear that more and more of the available
biomass will come to be concentrated in highly-complex
organisms. In your own body, the vast majority of cells are
bacterial; they outnumber our own cells. They are much
smaller, so we predominate in biomass. But when you add the
biomass of the free-living aquatic bacteria and then throw
in the enormous biomass of the newly discovered archaea in
geological formations where no "complex" organism can
survive, then they probably win even in biomass.

The fact of the matter is that if all of the bacteria should
ever disappear, we large things would all die in very short
order. If all of us large, complex things should ever
disappear, the bacteria would rejoice and prosper.
 
"TomHendricks474" <[email protected]> wrote in message
news:[email protected]...
> << Find me some underlying, unifying Standard Model of
> Complexity that has evaded notice in all studies to date,
> and I'll pay all the publication charges on your paper. >>
>
>
> Heat - energy moderation (mostly at the high and most
> damaging end of temperature). Any chemical system that
> doesn't survive the heat cycle doesn't exist. What does
> = life chemistry.
>
> Life= energy moderation with modification through
> descent. (It's obviously heat or matter - and heat's the
> better choice)
>
> Life didn't pop and then adapt; it was a response to a heat
> cycle (the other factors - gravity, pressure, pH, etc. -
> don't come near the effect of heat).
>
> The metabolism side of life is obviously a reaction to
> heat - that's what metabolism is. (First replication was
> probably just a novel way to survive heat another day)
>
> No life outside the bounds of liquid water 0-100C - thus to
> ensure survival all life must stay within temp bounds. And
> the closer it gets to its optimum temp the better its
> enzymes can perform - which are there to enhance chemistry
> at that temp in the first place.
>
> Life is a non-random response to a non-random heat cycle in
> the environment.

Maybe circa 3.8 billion years ago, but any record of such
early principles-in-action has been entirely overprinted by
subsequent life.

It's romantic to think that, within some relict "microbial
Galapagos" there may still exist some primordial population
of cells with only catalytic, self-replicating RNAs to do
their tricks with, or that the proper ingredients exist in
some odd habitat (maybe on Stanley Miller's benchtop?) so
that biogenesis could again occur in modern times. My guess
is that if something like this was the case, extant
organisms would make a quick lunch of it and leave not a
trace. Shame we're all alone as a species in our scientific
endeavors.
 
"Tim Tyler" <[email protected]> wrote in message
news:[email protected]...
> IRR <[email protected]> wrote or quoted:
>
> > [snip my earlier examples of morphologically elaborate but
> > genomically indistinguishable bacteria, quoted in full above]
>
> The best-established way of measuring complexity is to use
> Kolmogorov complexity.
>
> This:
>
> * Confines discussion to digital phenomena;
> * Is difficult to measure;
> * Has a subjective element - since it depends on a choice
> of descriptive language.
>
> The first is no problem, if we are content to confine
> ourselves to the complexity of genomes.
>
> The second is a problem in theory:
>
> * Except in a few trivial cases, you can only put bounds
> on the metric - rather than measure it exactly (and even
> then the lower bound is rarely much use). I would
> suggest ignoring this problem - and measuring the value
> using a conventional high-quality compressor of a type
> that is capable of dealing well with repeated sequences.
>
> ...and in practice...
>
> * You need to sequence the genome in question before you
> can measure its complexity;
>
> The third makes the metric less aesthetically attractive.
> My approach would probably be to say something along the
> lines of:
>
> "Always use FORTRAN-77 as your language".
....]

IMO this third problem -- choosing a language with which to
quantify complexity -- is still *the* showstopper when it
comes to biology. While we might all agree that the primate
brain is an incredibly complex organ, it's not at all agreed
upon what it is we mean by this. For example, a Kolmogorov
measure fails miserably in classifying the brain as complex;
after all, you're really only talking about two dozen or so
different recognized cell types stamped out in enormous
repetition with iterated connections between them -- in
other words, a digital representation of the brain is
incredibly compressible. I can think of examples with a
fraction of the genes present in a human brain cell that can
give rise to at least that many different recognizable cell
types, and I can just as easily think of organisms with a
vastly larger proteome that show none of the interactions or
differentiation of neural progenitor cells.

In my mind the obstacle here is that we have not yet rightly
/defined/ the problem. Our inherent notion of what is
complex is prominently if not exclusively based on (as
evident in my examples) visual cues, but prominent
biological observables such as cell morphology have been --
at least thus far -- incredibly poor indicators of
complexity.
 
IRR <[email protected]> wrote or quoted:

> > The bacteria are disadvantaged - since they can't easily
> > cooperate with one another, and build large structures -
> > and such cooperation seems to pay off.
> >
> > They have been (literally) overshadowed - and relegated
> > to the nooks and crannies of the world. These days much
> > of the work gets done by macroscopic organisms - such as
> > trees.
> >
> > Once bacteria ruled the world - but now they are in the
> > middle of a period of decline. Their decline seems
> > likely to continue - as much of the world's chemical
> > processing gets taken over by machines - who will have
> > stolen the bacteria's enzymatic secrets from their
> > genomes.
>
> This anthropocentric misinformation that you've slipped in
> at multiple points in this thread has no factual basis and
> no support in real science. There have been at least half
> a dozen articles in just the past few years in Science and
> Nature alone that point towards an inverse scale free
> relationship between organism size and organism density.
> That is, the very smallest organisms outnumber slightly
> larger organisms by orders of magnitude, and outnumber
> even larger organisms by many more orders of magnitude.

Well, yes - because they are so tiny. I am outnumbered millions-to-
one by my own gut bacteria - but that doesn't mean that they
are more important than I am.

> In fact the largest organisms on Earth (ourselves among
> them) make up only a minute fraction of the global
> biomass.

A *lot* of the global biomass is in the form of trees -
e.g.:

``The ongoing enrichment of the atmosphere with CO2 raises
the question of whether growth of forest trees, which
represent close to 90% of the global biomass carbon, is
still carbon limited at current concentrations of close to
370 p.p.m.''

...most of which are much bigger than us.

> Wipe out every multicellular organism on Earth and single-
> celled life will barely take notice. Take out the single-
> celled microbial contingent and every ecosystem on
> Earth will collapse in short order. We are pawns in a
> microbial world.

I'm currently totally dependent on my gut bacteria - while
at least some of them could survive without me.

...but again, that doesn't mean my gut bacteria are more
important than I am.

Many - and maybe most - single-celled organisms have had
their resources stolen by multi-cellular organisms.

The landscape has thus shifted away from single-celled
organisms and towards multi-cellular ones - and this shift
will continue.

In the future, there may still be many single-celled
organisms performing many of the world's important chemical
transformations - but they are likely to be sterile agents
produced in factories - rather than unruly free agents.

The factory is in a better position to do things like weave
different disease-resistance genes into every single organism
it creates - and employ intelligent design in their
production process.

Basically, those attempting a free-living lifestyle - and
only interfacing to the rest of the world economically -
won't have even the teeniest hope of survival in
such a form - in the face of a huge global cooperative
living organism competing for the same resources as them -
and able to compete by using direct competitor organisms
with no overhead of reproductive machinery, and the ability
to utilise genetic engineering in their design.

Life arose out of a microbial world - but the future lies
with more communal organisms. Most of the bacteria of the
world will not survive into such a world (and numerically
most of them are probably already gone).

Many of the enzymatic discoveries in their genomes will live
on, though - it will often be simpler to steal them than to
reinvent them.

Bacteria are life's red carpet:

They are rolled out first - but then trodden underfoot ;-)
--
__________
|im |yler http://timtyler.org/ [email protected] Remove
lock to reply.
 
irr <[email protected]> wrote or quoted:
> "Tim Tyler" <[email protected]> wrote in message
> > IRR <[email protected]> wrote or quoted:

> > [snip my list of problems with measuring Kolmogorov
> > complexity, quoted in full above]
> >
> > The third makes the metric less aesthetically
> > attractive. My approach would probably be to say
> > something along the lines of:
> >
> > "Always use FORTRAN-77 as your language".
> ....]
>
> IMO this third problem -- choosing a language with which
> to quantify complexity -- is still *the* showstopper when
> it comes to biology.

I like the answer I gave.

I almost always give this answer.

So far - IMO - I have had no serious complaints ;-)

There may be a few even more "unbiased" languages out there
- but FORTRAN-77 is convenient enough.

> While we might all agree that the primate brain is an
> incredibly complex organ, it's not at all agreed upon what
> it is we mean by this. For example, a Kolmogorov measure
> fails miserably in classifying the brain as complex, after
> all you're really only talking about two dozen or so
> different recognized cell types stamped out in enormous
> repetition with iterated connections between them -- in
> other words, a digital representation of the brain is
> incredibly compressible.

IMO - this makes no sense at all :-|

An acceptable digital version of the brain would handle the
same I/O - and produce similar outputs from similar inputs.
This sounds like a job for a huge computer with an
*extremely* lengthy description to me - and of course a
correspondingly enormous Kolmogorov complexity.
--
__________
|im |yler http://timtyler.org/ [email protected] Remove
lock to reply.
 
r norman <rsn_@_comcast.net> wrote in message news:<[email protected]>...
> On Sat, 17 Apr 2004 04:29:51 +0000 (UTC),
> [email protected] (Jim Menegay) wrote:
>
> >r norman <rsn_@_comcast.net> wrote in message
> >news:<[email protected]>...
> >> On Thu, 15 Apr 2004 03:57:36 +0000 (UTC),
> >> [email protected] (chupacabra) wrote:
> >> >The question that perplexes me - why does evolution
> >> >progress from the simple to the complex?

> >> The general idea now is that there is no "progression"
> >> of evolution towards more and more complex forms.
> >> Another way to look at it is as a random walk process.
> >> Evolution tends to spread out organisms in all
> >> directions. However, it started with simple things and
> >> there is a lower bound to how simple an organism can be
> >> and still be alive. So there is necessarily an increase
> >> in average complexity with time. Still, most things
> >> remain simple.

> >A vivid way of making this same point is to ask for an
> >explanation of the "remarkable" southward vector of human
> >migration in the first few millennia after the Bering
> >land bridge was crossed. What conceivable force drew
> >these early migrants inexorably to Tierra del Fuego?

> I like it! A very nice example.

Thank you. But I must tell you that my example was only
intended to illustrate my understanding of your idea, not my
endorsement. In fact, I believe that your idea, which I will
refer to as the "diffusion theory", is either incoherent,
unhelpful, unscientific, or contradicted by the evidence.

The theory would be incoherent if it were expressed, as it
sometimes is, in the form: "It is impossible to define
'complexity', and even if you could define it, we already
understand it". We will therefore assume that your version
of the theory is not incoherent - that you accept that
complexity may be definable, though consensus has not yet
been achieved on the exact definition.

The theory would be unhelpful if it were used to discourage
investigation into the best definition of 'complexity' and
the collection of empirical evidence. We will assume that
you do not wish to be unhelpful.

In fact, I will assume that you agree that one useful metric
for complexity is the information content of the genome -
that is, the size of the genome adjusted downward to account
for those sites where drift is unconstrained. This is
roughly Kolmogorov complexity, I think.

The theory is unscientific if it is maintainable regardless
of what the empirical evidence actually shows. I don't
particularly relish using the P-word, but a scientific
theory has to make predictions that can conceivably be
refuted by the evidence. I wonder whether you would agree
with me that the diffusion theory makes the following
specific predictions:

1. The rate of max complexity increase decreases with
geological time. This is because it is a
characteristic of a random-walk mechanism that the
square of the max complexity measure would be expected
to grow only linearly.
2. Looking at a single species, it is just as likely to take
a step toward lower complexity as toward higher
complexity.
3. Out of all possible organisms that might be viable in a
particular ecological niche, it is the least complex one
that is likely to seize the niche first.
4. When an appropriate weighting scheme is used that gives
appropriate weight to the bacteria and other simple
organisms, it is seen that average complexity is not very
high (obviously), and it is not really increasing. The
typical modern bacterium is not more complex than its
ancestor of 2Gy ago. The typical modern protozoan is not
more complex than its ancestor of 1Gy ago. Or, if you
focus on a particular species and niche, the modern
cyanobacterium is not more complex than its ancestors.

The theory appears to me not to be borne out by the facts:
5. The leading edge taxa in the rise of complexity - the
metazoa and the metaphytes - appear to have increased
their complexity more within the last 250My than in the
250My before, and much more than in the 500My before
that. Complexity increase among the already complex seems
to be accelerating.
6. There appears to be a kind of Cope's law of complexity
- complexity tends to increase in almost all branches
of a taxon.
7. It seems clear to me that while natural selection may be
an economic optimizer, it is not an optimizer of
organizational structure. Successful organisms are much
more complex than they absolutely need to be.
8. You seem to believe that the simplest organisms have not
increased much in complexity over the past 1-2Gy. I
suspect that they may have doubled in complexity. Our
conflicting intuitions may be testable using comparative
genomics - though at the present time we don't have
enough "branchiness" in our phylogenetic trees to
reliably reconstruct models of the ancestors of modern
microorganisms. The phenomenon of lateral gene transfer
also creates problems. So, this disagreement may remain
unresolved for a long time.

For all of these reasons, I think that the phenomenon of the
increase in complexity is a real one which deserves an
explanation. However, I definitely do not believe, as some
people seem to, that the explanation is to be found in the
mathematics of "complex system dynamics" and will be
revealed to us in the next book by Prigogine, Kauffman,
Wolfram, or Chaisson.

I think we already have an adequate biological understanding
of why neoDarwinist evolution leads to increased complexity
over time. We know that increased complexity arises because
"duplication and divergence of function" is the mechanism
that is available. This mechanism is used at both the level
of the genes and at the level of morphogenesis. By contrast,
there is no simple way for a working organism to shift to a
lower complexity. To use the terminology of business
administration, NS is just not very good at "re-engineering"
the organism to, for example, find commonality between two
functions and reorganize/simplify the way that functionality
is delivered.
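
A toy model of that asymmetry - my own invention, with arbitrary
probabilities, not anything drawn from real genomics:

import random

random.seed(4)

def gene_count_walk(generations=5000, p_dup=0.01, p_loss=0.002):
    # Duplication-and-divergence adds a gene fairly easily; cleanly
    # "re-engineering" one away is assumed to be much rarer. The
    # asymmetry alone ratchets the count upward.
    genes = 100
    history = []
    for t in range(generations):
        r = random.random()
        if r < p_dup:
            genes += 1
        elif r < p_dup + p_loss and genes > 1:
            genes -= 1
        if t % 1000 == 0:
            history.append(genes)
    return history

print(gene_count_walk())

The count drifts steadily upward - nothing like the symmetric random
walk of the diffusion picture.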

So, how does my idea stand up against my criteria of
coherence, usefulness, refutability, and correspondence with
the facts? Well, I hope it isn't incoherent, but I must
admit that it has a "shut down the debate" attitude that is
unhelpful. I will leave it to more expert critics to
evaluate it regarding refutation and verification.
 
irr <[email protected]> wrote:

> "Tim Tyler" <[email protected]> wrote in message
> news:[email protected]...
> > IRR <[email protected]> wrote or quoted:
> >
> > > So for example (and in loose reference to one of your
> > > other posts),
> there
> > > are many, many groups of bacteria that cooperate,
> > > differentiate into a multitude of morphologically
> > > distinct cell types, and build elaborate, macroscopic
> > > (differentiated though monoclonal) structures. [See,
> > > for examples, hyphae-producing Streptomyces, akinete
> > > and/or heterocyst
> forming
> > > cyanobacteria, iron-scavenging Shewanella
> > > macrocolonies, or the wondrous assortment of organisms
> > > that comprise e.g. thermophilic, alkaline
> microbial
> > > mats]. Yet the Kolmogorov complexity of the genomes
> > > from these
> organisms
> > > are indistinguishable, they all have roughly the same
> > > number of genes,
> the
> > > degree of their metabolic and protein interaction
> > > networks are all scale free (power-law distributed)
> > > with nearly identical scaling exponents,
> etc.
> > > etc. ad nauseum. Find me some underlying, unifying
> > > Standard Model of Complexity that has evaded notice in
> > > all studies to date, and I'll pay
> all
> > > the publication charges on your paper.
> >
> > The best-established way of measuring complexity is to
> > use Kolmogorov complexity.
> >
> > This:
> >
> > * Confines discussion to digital phenomena;
> > * Is difficult to measure;
> > * Has a subjective element - since it depends on a
> > choice of descriptive language.
> >
> > The first is no problem, if we are content to confine
> > ourselves to the complexity of genomes.
> >
> > The second is a problem in theory:
> >
> > * Except in a few trivial cases, you can only put bounds
> > on the metric - rather than measure it exactly (and
> > even then the lower bound is rarely much use). I would
> > suggest ignoring this problem - and measuring the
> > value using a conventional high-quality compressor of
> > a type that is capable of dealing well with repeated
> > sequences.
> >
> > ...and in practice...
> >
> > * You need to sequence the genome in question before you
> > can measure its complexity;
> >
> > The third makes the metric less asethetically
> > attractive. My approach would probably be to say
> > something along the lines of:
> >
> > "Always use FORTRAN-77 as your language".
> ....]
>
> IMO this third problem -- choosing a language with which
> to quantify complexity -- is still *the* showstopper when
> it comes to biology. While we might all agree that the
> primate brain is an incredibly complex organ, it's not at
> all agreed upon what it is we mean by this. For example, a
> Kolmogorov measure fails miserably in classifying the
> brain as complex, after all you're really only talking
> about two dozen or so different recognized cell types
> stamped out in enormous repetition with iterated
> connections between them -- in other words, a digital
> representation of the brain is incredibly compressible. I
> can think of examples with a fraction of the genes present
> in a human brain cell that can give rise to at least that
> many different recognizeable cell types, and I can just as
> easily think of organisms with a vastly larger proteome
> that show none of the interactions or differentiation of
> neural progenitor cells.
>
> In my mind the obstacle here is that we have not yet
> rightly /defined/ the problem. Our inherent notion of what
> is complex is prominently if not exclusively based on (as
> evident in my examples) visual cues, but prominent
> biological observables such as cell morphology have been
> -- at least thus far -- incredibly poor indicators of
> complexity.

Chaitin suggests that you pick a suitably comprehensive and
capable language - he uses LISP - and stick with it. So long
as you can express or model what you are trying to
comparatively measure, then the results are notionally
commensurate. But the problem here is that this makes
"complexity" a measure of what we can *say* about something.
And this seems right to me - complexity is the surprisal
value, as it were, of our descriptions of things. The
relative scales will change as we are able to describe more
and more, unless we artificially restrict ourselves to some
aspect of the things, like the DNA (introns or exons?
Expressed or regulatory? Methylation or not?) sequences of
organisms, in which case complexity is a measure of some
theoretical quantity and is *still* an abstract notion. I
don't see a way to avoid measuring only what we know about
things in this regard.
--
John Wilkins [email protected]
http://www.wilkins.id.au "Men mark it when they hit, but do
not mark it when they miss" - Francis Bacon
 
On Sun, 18 Apr 2004 21:46:44 +0000 (UTC), [email protected]
(Jim Menegay) wrote:

>r norman <rsn_@_comcast.net> wrote in message
>news:<[email protected]>...
>> On Sat, 17 Apr 2004 04:29:51 +0000 (UTC),
>> [email protected] (Jim Menegay) wrote:
>>
>> >r norman <rsn_@_comcast.net> wrote in message
>> >news:<[email protected]>...
>> >> On Thu, 15 Apr 2004 03:57:36 +0000 (UTC),
>> >> [email protected] (chupacabra) wrote:
>> >> >The question that perplexes me - why does evolution
>> >> >progress from the simple to the complex?
>
>> >> The general idea now is that there is no "progression"
>> >> of evolution towards more and more complex forms.
>> >> Another way to look at it is as a random walk process.
>> >> Evolution tends to spread out organisms in all
>> >> directions. However, it started with simple things and
>> >> there is a lower bound to how simple an organism can
>> >> be and still be alive. So there is necessarily an
>> >> increase in average complexity with time. Still, most
>> >> things remain simple.
>
>> >A vivid way of making this same point is to ask for an
>> >explanation of the "remarkable" southward vector of
>> >human migration in the first few millenia after the
>> >Berring land bridge was crossed. What conceivable force
>> >drew these early emigrants enexorably to Tierra del
>> >Fuego?
>
>> I like it! A very nice example.
>
>Thank you. But I must tell you that my example was only
>intended to illustrate my understanding of your idea, not
>my endorsement. In fact, I believe that your idea, which I
>will refer to as the "diffusion theory", is either
>incoherent, unhelpful, unscientific, or contradicted by the
>evidence.
>
>The theory would be incoherent if it were expressed, as it
>sometimes is, in the form: "It is impossible to define
>'complexity', and even if you could define it, we already
>understand it". We will therefore assume that your version
>of the theory is not incoherent - that you accept that
>complexity may be definable, though consensus has not yet
>been achieved on the exact definition.
>
>The theory would be unhelpful if it were used to discourage
>investigation into the best definition of 'complexity' and
>the collection of empirical evidence. We will assume that
>you do not wish to be unhelpful.
>
>In fact, I will assume that you agree that one useful
>metric for complexity is the information content of the
>genome - that is, the size of the genome adjusted downward
>to account for those sites where drift is unconstrained.
>This is roughly Kolmogoroff complexity, I think.
>
>The theory is unscientific if it is maintainable regardless
>of what the empirical evidence actually shows. I don't
>particularly relish using the P-word, but a scientific
>theory has to make predictions that can conceivably be
>refuted by the evidence. I wonder whether you would agree
>with me that the diffusion theory makes the following
>specific predictions:
>
>1. The rate of max complexity increase decreases with
> geological time. This is because it is a characteristic
> of a random-walk mechanism that the square of the max
> complexity measure would be expected to grow only
> linearly.
>2. Looking at a single species, it is just as likely to
> take a step toward lower complexity as toward higher
> complexity.
>3. Out of all possible organisms that might be viable in a
> particular ecological niche, it is the least complex one
> that is likely to sieze the niche first.
>4. When an appropriate weighting scheme is used that gives
> appropriate weight to the bacteria and other simple
> organisms, it is seen that average complexity is not
> very high (obviously), and it is not really increasing.
> The typical modern bacteria is not more complex than its
> ancestor of 2Gy ago. The typical modern protozoa is not
> more complex than its ancestor of 1Gy ago. Or, if you
> focus on a particular species and niche, the modern
> cyanobacterium is not more complex than its ancestors.
>
>The theory appears to me to be not borne out by the facts.
>1. The leading edge taxa in the rise of complexity - the
> metazoa and the metaphytes - appear to have increased
> their complexity more within the last 250My than in the
> 250My before, and much more than in the 500My before
> that. Complexity increase among the already complex
> seems to be accelerating.
>2. There appears to be a kind of Cope's law of complexity -
> complexity tends to increase in almost all branches of a
> taxon.
>3. It seems clear to me that while natural selection may be
> an economic optimizer, it is not an optimizer of
> organizational structure. Successful organisms are much
> more complex than they absolutely need to be.
>4. You seem to believe that the simplest organisms have not
> increased much in complexity over the past 1-2Gy. I
> suspect that they may have doubled in complexity. Our
> conflicting intuitions may be testable using comparative
> genomics - though at the present time we don't have
> enough "branchiness" in our phylogenetic trees to
> reliably reconstruct models of the ancestors of modern
> microorganisms. The phenomenon of lateral gene transfer
> also creates problems. So, this disagreement may remain
> unresolved for a long time.
>
>For all of these reasons, I think that the phenomenon of
>the increase in complexity is a real one which deserves an
>explanation. However, I definitely do not believe, as some
>people seem to, that the explanation is to be found in the
>mathematics of "complex system dynamics" and will be
>revealed to us in the next book by Prigogine, Kauffman,
>Wolfram, or Chaisson.
>
>I think we already have an adequate biological
>understanding of why neoDarwinist evolution leads to
>increased complexity over time. We know that increased
>complexity arises because "duplication and divergence of
>function" is the mechanism that is available. This
>mechanism is used at both the level of the genes and at the
>level of morphogenesis. By contrast, there is no simple way
>for a working organism to shift to a lower complexity. To
>use the terminology of business administration, NS is just
>not very good at "re-engineering" the organism to, for
>example, find commonality between two functions and
>reorganize/simplify the way that functionality is
>delivered.
>
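[That asymmetry -- duplication being an easy move and
streamlining a hard one -- can be caricatured as a biased
walk. A minimal sketch; the probabilities are invented for
illustration, not estimates from any genome:

    import random

    def gene_count(generations=10000, p_dup=0.010, p_loss=0.002,
                   start=100, seed=7):
        """Biased ratchet: duplications are assumed to fix more
        easily than reorganizations that remove redundant
        function, so the count drifts upward."""
        random.seed(seed)  # illustrative parameters only
        n = start
        for _ in range(generations):
            r = random.random()
            if r < p_dup:
                n += 1                 # duplication and divergence
            elif r < p_dup + p_loss:
                n = max(1, n - 1)      # rare simplification
        return n

    if __name__ == "__main__":
        print([gene_count(seed=s) for s in range(5)])

Even a small bias in favour of duplication drifts the count
steadily upward, with no optimizer anywhere in the loop.]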
>So, how does my idea stand up against my criteria of
>coherence, usefulness, refutability, and correspondence
>with the facts? Well, I hope it isn't incoherent, but I
>must admit that it has a "shut down the debate" attitude
>that is unhelpful. I will leave it to more expert critics
>to evaluate it regarding refutation and verification.

I don't want to ignore you but I do have approximately 60
exams to finish grading and return tomorrow and another 70
for Wednesday. Your post demands some actual thought and
consideration (unlike most of what gets posted on
newsgroups). I'll try to get to it as soon as I can.
 
"Frank Reichenbacher" <[email protected]> wrote in message news:<c5m9ck$30f8
>
> How large do you think the niche space is, or how many
> niche spaces do you think there are, for sentient beings
> on this planet? Will the next sentient being have to
> develop some new structure or function (say telepathy) in
> order to outcompete us, or find a new niche space?
>
Presently, as you know, the most successful and probably
smartest people usually have few offspring, while, at least
in developed countries, the birth rate is much higher among
undereducated people of lower intellectual status. So in
modern humans we have, in effect, a "devolution" in which
each generation is, on average, less "smart" than the
previous one. One might object, following Dawkins, that what
really matters for the progress of human civilization is not
the selection of genes but the selection of memes, or ideas.
But if humans keep growing biologically more stupid, a point
may come when they simply lack the brainpower to command the
accumulated memes.
 
"Tim Tyler" <[email protected]> wrote in message
news:[email protected]...
> IRR <[email protected]> wrote or quoted:
>
> > > The bacteria are disadvantaged - since they can't
> > > easily cooperate with one another, and build large
> > > structures - and such cooperation seems to pay off.
> > >
> > > They have been (literally) overshadowed - and
> > > relegated to the nooks and crannies of the world.
> > > These days much of the work gets done by macroscopic
> > > organisms - such as trees.
> > >
> > > Once bacteria ruled the world - but now they are in
> > > the middle of a period of decline. Their decline seems
> > > likely to continue - as much of the world's chemical
> > > processing gets taken over by machines - who will
> > > have stolen the bacteria's enzymatic secrets from
> > > their genomes.
> >
> > This anthropocentric misinformation that you've slipped
> > in at multiple points in this thread has no factual
> > basis and no support in real science. There have been
> > at least half a dozen articles in just the past few
> > years in Science and Nature alone that point towards an
> > inverse scale-free relationship between organism size
> > and organism density. That is, the very smallest
> > organisms outnumber slightly larger organisms by orders
> > of magnitude, and outnumber even larger organisms by
> > many more orders of magnitude.
>
> Well, yes - because they are so tiny. I am outnumbered
> millions-to-one by my own gut bacteria - but that doesn't
> mean that they are more important than I am.

Certainly not what I meant to imply -- I'm not even sure if
Kolmogorov had a metric by which we could start to
categorize importance :). But I would maintain that the
enormous numbers of microorganisms out there argue that they
aren't in any sort of decline. In fact, in the middle of
this sixth mass extinction we've apparently instigated, I'd
say they are in better shape than anyone.
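[To put rough numbers on the size-density scaling cited
above: if numerical density falls off as a power of body
size, N(s) ~ s**(-alpha), the abundance ratio between two
size classes depends only on the size ratio and the
exponent. The exponent and sizes below are assumptions
picked purely for illustration:

    def abundance_ratio(size_small, size_large, alpha=2.0):
        """How many more individuals in the small size class
        than in the large one, under N(s) ~ s**(-alpha)."""
        # alpha and the sizes passed in are illustrative only
        return (size_large / size_small) ** alpha

    if __name__ == "__main__":
        # ~1 um bacterium vs ~10 um protist vs ~1 mm metazoan
        print(abundance_ratio(1e-6, 1e-5))    # 100-fold
        print(abundance_ratio(1e-6, 1e-3))    # a million-fold

With any exponent in that neighbourhood, the smallest cells
dominate the head count by construction -- which, by itself,
says nothing about biomass or "importance".]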

> > In fact the largest organisms on Earth (ourselves among
> > them) make up only a minute fraction of the global
> > biomass.
>
> A *lot* of the global biomass is in the form of
> trees - e.g.:
>
> ``The ongoing enrichment of the atmosphere with CO2 raises
> the question of whether growth of forest trees, which
> represent close to 90% of the global biomass carbon, is
> still carbon limited at current concentrations of close to
> 370 p.p.m.''
>
> ...most of which are much bigger than us.

Unfortunately, these very dated statistics are only valid if
you're on Bush's Science Advisory Board. See, for example,
Whitman et al.'s "Prokaryotes: The unseen majority"
(Proceedings of the National Academy of Sciences 95:
6578-83, 1998) for a starter read, and follow the trail of
references therein. Note that this article is already six
years old and so predates recent major discoveries in the
"deep subsurface" biosphere (e.g. Lidy Hot Springs and
similar studies), which we have only begun to explore and
which may represent an unseen microbial contingent larger
than all terrestrial life combined.

And of course I'd argue that the counting is already
biased -- every cell of every macroscopic organism carries
anywhere from a handful to several thousand mitochondria
(and plant cells carry chloroplasts as well) that were
unjustly abducted from the microbial domain and now count
towards the eukaryotic total!

> > Wipe out every multicellular organism on Earth and
> > single-celled life will barely take notice. Take out the
> > single-celled microbial contingent and every ecosystem
> > on Earth will collapse in short order. We are pawns in a
> > microbial world.
>
> I'm currently totally dependent on my gut bacteria - while
> at least some of them could survive without me.
>
> ...but again, that doesn't mean my gut bacteria are more
> important than I am.
>
> Many - and maybe most - single-celled organisms have had
> their resources stolen by multi-cellular organisms.
>
> The landscape has thus shifted away from single-celled
> organisms and towards multi-cellular ones - and this shift
> will continue.
>
> In the future, there may still be many single-celled
> organisms performing many of the world's important
> chemical transformations - but they are likely to be
> sterile agents produced in factories - rather than unruly
> free agents.
>
> The factory is in a better position to do things like
> weave different disease resistance genes into every single
> organism it creates - and employ intelligent design in
> their production process.
>
> Basically, those attempting a free-living lifestyle - and
> only interfacing to the rest of the world economically -
> won't have even the teeniest, most remote hope of survival
> in such a form - in the face of a huge global cooperative
> living organism competing for the same resources as them -
> and able to compete by using direct competitor organisms
> with no overhead of reproductive machinery, and the
> ability to utilise genetic engineering in their design.
>
> Life arose out of a microbial world - but the future lies
> with more communal organisms. Most of the bacteria of the
> world will not survive into such a world (and numerically
> most of them are probably already gone).
>
> Many of the enzymatic discoveries in their genomes will
> live on, though - it will often be simpler to steal them
> than to reinvent them.
....]

Even though we have some outstanding evidence of eukaryotes
stealing genes from microbes (e.g. chloroplasts and
mitochondria, as mentioned above), all the data seem to
indicate that most of the gene 'thievery', namely horizontal
gene transfer, goes on among and between microorganisms. In
fact, outside of a few noted examples there's quite a bit of
argument at present over whether horizontal gene transfer is
an important mechanism in eukaryotic genetic innovation at
all, having been largely and effectively replaced

purports to change the face of this....