Today's lecture focuses on chaos theory. The assigned book is Chaos by James Gleick. Part of what's analyzed is reductive science, which is basically the idea that we can dig down into ever smaller portions of a thing and ultimately gain knowledge about that thing. So we can go from saying people have feelings, to people have limbic systems, to people have neurotransmitters, and on down the line, and at each level we come closer to the fundamental building blocks. These blocks are then believed to be consistent: figure them out scientifically and you can reproduce the results. Part of chaos theory's critique is that there is no end to the potential for reducing (think quarks), and that at a certain point we hit the Heisenberg Uncertainty Principle and end up with randomness. As the lecture goes on, prior themes come to mind, such as the earlier point about the frontal cortex, the most complex part of us, being the least constrained by genes.
Genes are reductionism. Jumping genes, transcription factors, epigenetic influences, etc. are chaos-like in that fundamental patterns are altered in unpredictable ways, or at least in ways that aren't controlled and determined in the traditional sense.
Thomas Aquinas. 3 things God cannot do.
1. Sin.
2. Make a copy of Himself.
3. Make a triangle whose angles sum to more than 180 degrees.
Science over religion: the universe as ordered, with absolutes. And we have the introduction of reductionism. Understand a complex system by breaking it down into its parts; understand those and you get the whole.
This is core to science.
Linearity. Additivity. Add component parts together and you can produce the end result. If you know the starting state you can figure out what the end result will be.
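To make the linearity/additivity idea concrete, here's a minimal toy sketch (my own illustration, not the lecture's): a system whose output is just proportional to its input, so responses to components add, and the starting state fully determines the end result.

```python
# A minimal sketch of linearity/additivity (illustrative toy numbers only):
# in a linear system the response to combined inputs is the sum of the
# responses to each input, so the starting state determines the end state.

def respond(x):
    """A toy linear rule: the system's output is proportional to its input."""
    return 3.0 * x

a, b = 2.0, 5.0

# Additivity: respond(a + b) equals respond(a) + respond(b).
print(respond(a + b), respond(a) + respond(b))   # 21.0 21.0

# Predictability: run it again from the same starting state, get the same end state.
print(respond(a), respond(a))                    # 6.0 6.0
```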
And if it's reductive, then there's a blueprint that points the system toward what it should end up looking like.
Significantly, the variability that emerges in data is viewed as junk, noise, instrument error: something to be gotten rid of. And the thinking is that the way to get rid of it is to be more reductive; the closer you get, the less variability there should be. Eventually you should be able to measure the true, iconic norm.
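A quick sketch of that traditional picture (the numbers and the Gaussian noise model are my assumptions, not from the lecture): if a measurement is just "true norm + noise," then averaging more and more measurements should wash the noise out and recover the iconic norm.

```python
# A minimal sketch of the "variability is just noise" assumption (illustrative only):
# each measurement = true norm + random instrument error, so averaging many
# measurements should converge on the idealized true value.

import random

TRUE_NORM = 100.0

def measure():
    # noise treated purely as removable instrument error
    return TRUE_NORM + random.gauss(0, 5)

for n in (10, 1000, 100000):
    mean = sum(measure() for _ in range(n)) / n
    print(n, round(mean, 3))
# The average creeps toward 100.0 - the chaos-theory objection below is that in many
# real systems the variability is not removable noise but part of the system itself.
```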
In Chaos, Gleick points out that hard-to-measure systems were basically ignored and considered unscientific. In HumBio, think back to the heritability segment: science reduces down to one controllable variable in the lab, gets results, and then calls those results scientific truth. There's a lot of room for inaccuracy there, since real systems are much more variable.
The human body thus goes down levels. Body, organs, cells, etc.
But it doesn't work this way for everything. Hubel and Wiesel - the theory of individual "grandmother" neurons: one neuron for a dot, feeding into one for a line, then a curve, and so on, the thinking being that one neuron stores one thing, from simple features all the way up to something as complex as your grandmother's face.
But the cortex seems to work in systems and networks.
Bifurcating systems. Scale-free. All the branch points on neurons are bifurcating (dendritic trees). The circulatory system is also bifurcating, as is the pulmonary system.
Just like with neurons, there aren't enough genes to code for the bifurcating system gene by gene. It cannot be a reductive, point-for-point solution.
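A minimal sketch of why that is (the simple two-way branching rule is a toy assumption, not the lecture's model): one recursive rule plus a single depth parameter generates an enormous number of branch points, so there's no need for point-for-point instructions.

```python
# A minimal sketch: a single branching rule generates exponentially many branch
# points, illustrating why a bifurcating tree doesn't need gene-by-gene coding.

def count_branch_points(depth):
    """Each segment splits into two; count the branch points in the whole tree."""
    if depth == 0:
        return 0
    # one branch point here, plus the branch points in the two sub-trees
    return 1 + 2 * count_branch_points(depth - 1)

for d in (5, 10, 20):
    print(d, count_branch_points(d))
# 5 31
# 10 1023
# 20 1048575  <- about a million branch points from one rule and one number
```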
Chance. Brownian motion: the distribution of cellular material already differs between the two daughter cells from the very first division.
The takeaway is that the most interesting stuff can't be regulated in a simple, reductive way.
Deterministic + periodic.
Deterministic + aperiodic. This is where our waterwheel comes in. It's not ostensibly linear, but it is deterministic; the pattern never exactly repeats, it's simply complicated.
Non-deterministic (random elements).
Chaotic - a pattern that never repeats. When the amount of force added crosses a threshold, the system goes from a periodic or aperiodic pattern to one that no longer has a repeating, observable structure. The magic number seems to be 3: once three distinct patterns appear on a repeating structure, you're closing in on a chaotic system.
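To see the threshold idea concretely, here's a small sketch using the logistic map, a standard textbook example (my choice of system, not necessarily the lecture's): below a certain parameter value the long-run behavior repeats in a cycle; push the parameter a bit further and the behavior stops repeating altogether.

```python
# A minimal sketch with the logistic map x -> r*x*(1-x): as r crosses a threshold,
# the long-run behavior goes from a repeating cycle to a pattern that never repeats.

def long_run_values(r, x0=0.5, warmup=500, keep=8):
    """Iterate the map, discard transients, return the values it keeps visiting."""
    x = x0
    for _ in range(warmup):
        x = r * x * (1 - x)
    out = []
    for _ in range(keep):
        x = r * x * (1 - x)
        out.append(round(x, 4))
    return out

print(long_run_values(2.9))   # one repeated value: a fixed point
print(long_run_values(3.2))   # alternates between two values: period 2
print(long_run_values(3.5))   # four values: period 4
print(long_run_values(3.9))   # no repeating pattern: chaotic
```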
With these strange attractors, the pattern doesn't really repeat - somewhere around the millionth decimal place there's a minor change, which in turn leads to a slightly different next value. These differences amplify with each new value; this is the so-called butterfly effect (the marginal impact of the wings changes the environment slightly...).
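And the butterfly effect itself, sketched with the same logistic map (again an illustrative stand-in, not the lecture's specific example): two starting values that differ only far out in the decimals end up bearing no resemblance to each other after a few dozen iterations.

```python
# A minimal sketch of sensitive dependence on initial conditions: a tiny difference
# in the starting value is amplified with each iteration until the two trajectories
# are completely unrelated. The logistic map at r = 3.9 stands in for a chaotic system.

def iterate(x, steps, r=3.9):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

x_a = 0.123456
x_b = 0.123456 + 1e-9   # a disturbance far out in the decimals

for steps in (10, 30, 60):
    a, b = iterate(x_a, steps), iterate(x_b, steps)
    print(steps, round(a, 4), round(b, 4), "difference:", round(abs(a - b), 4))
# After a few dozen steps the two trajectories bear no resemblance to each other.
```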
Fractal - information that codes for a pattern whose parts have the same type of complexity and variability as the whole, at every scale. Think bifurcations.
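A small self-similarity sketch using the Koch curve, a standard example (not the lecture's): the same replacement rule is applied at every scale, so every zoomed-in piece has the same kind of structure as the whole.

```python
# A minimal self-similarity sketch: in the Koch curve, every straight segment is
# replaced by 4 segments one-third as long, at every level of magnification.

def koch_levels(levels):
    counts, seg_len = [1], [1.0]
    for _ in range(levels):
        counts.append(counts[-1] * 4)
        seg_len.append(seg_len[-1] / 3)
    return list(zip(counts, seg_len))

for n, (segments, length) in enumerate(koch_levels(5)):
    print(n, segments, round(segments * length, 3))
# Total length grows by 4/3 per level, and every zoomed-in piece repeats the
# structure of the whole: the same complexity and variability at every scale.
```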
Thus science encounters the problem that the variability is the system, and the only way to produce accurate, true data is to include the "noise." Reductive approaches can still be very effective; the data just won't reflect an absolute reality.