Opinion: What bearing does "human nature" have on the outcome of capitalist-technocracy?

Tycho- Taylor's Version

I'll admit up front that this thread was the result of watching Manhunt: Unabomber and then skimming ol' Ted's "manifesto".

Without getting into the details of that particular text - an idea worth talking about is whether there is such a thing as a static or slowly morphing "human nature" that is currently bearing the load of social and technological progress which has not made its fulfillment a top priority.

Here I see a few different possibilities:

1. Human nature is infinitely adaptable, or there is no enduring human nature (beyond that which is adapting).

2. Human nature is adaptable, and adapts quickly enough that it will always be compatible with future technological developments.

3. Human nature is adaptable, but will eventually be outpaced by technological change resulting in (a) disaster or (b) transcendence.

4. Human nature is static or minimally adaptable, and already under significant (perhaps even irreversible) threat from social and technological forces that are opposed to it.

I think differing opinions on this topic are starting to manifest themselves ideologically in mainstream thought. Technocrats vary between 1 and 3, with some "futurists" expecting human nature to be greatly transformed in the time to come. Others resist this idea within reasonable bounds, as in 2 or 3a/b, and at least one has mailed murderous bombs out of sheer urgency and concern over 4 (so stated).

I don't see any that are completely nuts, which means lots of conversations are to be had.

Which position would you say is the closest to being correct, and why?
 
What bearing does "human nature" have on the outcome of capitalist-technocracy?

3. Human nature is adaptable, but will eventually be outpaced by technological change resulting in (a) disaster or (b) transcendence.

Deep philosophical subject. I'll go with your number 3 above.

Human nature has always adapted and always will. Technology is getting ready to continue doing away with thousands of human jobs all over the world. You can blame that on AI (Artificial Intelligence). In another 50 to 100 years, there will be few jobs left for humans to do. It should result in some form of disaster (a) for a good portion of mankind. First World countries will adapt. Third World countries will be a mess. We are kind of seeing that already.
 
I'll admit up front that this thread was the result of watching Manhunt: Unabomber and then skimming ol' Ted's "manifesto". Without getting into the details of that particular text - an idea worth talking about...
In summary, here is what the long 'manifesto' talks about:

"Technological society requires the breakdown of family and small-scale social groups. Workers must be able to move where they are needed. If people's first loyalty is not to society, they will form small groups that work to their own advantage against society."

"In many ways the internet is designed to never turn off. Finally, if the triumph of a technological takeover is the disaster that Kaczynski outlines – robbing souls of freedom, initiative, sanity, or the environment of its sustainability – and if this prison is inescapable, then the system must be destroyed."
 
Absolute power corrupts absolutely, and for as long as humans have existed they have sought more power.
 
While human nature is adaptable, it's clear that technology has been outpacing our ability to adapt for much longer than people realize.

All one needs to do is look at our schools...all over the world.

Go back to when schooling was being made compulsory by law. In several Western nations that was in the mid-1800s. By the end of the 1800s there were still parts of the West where it was not fully implemented, meaning you could find small towns that had no schools, and children in poor families still had to work for those families to survive. It wasn't until child labor laws stopped that practice that all children were put into schools. Now look at the schools. Back then it was 15-20 children in a classroom; that is not the case today. We still have one teacher per class, but now those classes are overflowing and it's harder for teachers to help everyone who really needs it.

Then there are the vast changes in the schools themselves. They used to be vocationally based; now they are general-education based, and that general education has evolved greatly, with far more being taught today. Yet it's all being taught in virtually the same way, if you exclude the much-maligned Common Core being taught in America (one of the leading factors in its school system's decline). Why? Why hasn't teaching evolved? We had no electricity when the system was created, no communication systems outside of the telegraph...

Case in point: back in the 1930s, when television was being pushed and people needed convincing of its importance, a pitchman said, "Imagine this device in every home. Imagine for a moment the best and brightest in the world, say, Albert Einstein if you will, standing in front of a blackboard, looking into the camera and giving a science lesson to every American citizen. No longer will our children be trapped in a school, behind a desk, learning from an average person; instead, a nation taught by the experts of experts in all fields."

Almost 90 years later, there are televisions in every home and an internet that makes it even easier, and nothing remotely like that exists. We are all still taught in the exact same format. There was even a push for change in the early days of YouTube: a guy started a channel called Khan Academy with a new way to teach, and it got the backing of Bill Gates, who donated millions and helped get it into a school district. And it worked wonders; that district got a massive boost in grades. And then... nothing. Unions stopped it from spreading.

We have had laptops since the mid-1990s, tablets for some 15 years, touch screens for 10+ years... and schools have not evolved.

This is just one of thousands of ways we have not adapted. I could write pages about how social media has wrecked our society, from the individual all the way up to the entire media industry: breakdowns in people's ability to communicate personally, the SJW syndrome, virtue signalling so bad that people are risking having all their freedoms taken away.

I believe our adaptability has limits, and we may have reached them. I also have zero doubt that if we invent a way to merge man and machine, humans will eventually cease to exist, due to our own egos thinking that copying our brains into electronic format means we will remain who we are.
 
I don't think humans can adapt to technology, it can only shape us.

Look at our physical degeneration as we become more sedentary. Under normal circumstances, those who lack adaptational fitness would die out. Now, with the benefit of technology, even the most unfit specimens of humanity not only survive but procreate and die old. That isn't due to some sudden realization that increased our fitness, but to technological advances that allowed us to mitigate that lack of fitness.

The story of human progression is the story of our development, advancement, and integration of technological advances into our society. The body itself is the final frontier of that progression. Artificial Intelligence is not the quest to create intelligence on par with humans, but to replicate the mechanism of human intelligence at scale. That is to say, to surpass human intelligence using ourselves as a blueprint, not the intended product.
 
I don't like the inevitability in 2 and 3, and thus I don't pick any. Don't love the phrasing either. I'd say that human nature is pretty static if you look at the deep structure of it, but it can manifest in very different ways depending on incentives in a society. But the incentive structure is also driven by people so I think there's a continuing process of tension and then bending (or not--it's certainly possible for societies to basically collapse, and we've seen it many times).

With regard to that structure, I pretty much follow Adam Smith (particularly The Theory of Moral Sentiments). This bit is particularly relevant:

As to love our neighbour as we love ourselves is the great law of Christianity, so it is the great precept of nature to love ourselves only as we love our neighbour, or what comes to the same thing, as our neighbour is capable of loving us.
 
From a purely biological perspective, I would have to say somewhere between 3a and 4, if I correctly interpret 1 through 4 as some sort of continuum.

The reason I feel that way is that, while we have large brains and observational learning, there is always going to be a genetically deterministic factor involved, at least until we get gene editing with CRISPR working. The reason I chose 3a is that there seems to be a repeating theme of a "race to the bottom" with humans. That race-to-the-bottom instinct will lead to irreversible environmental change and weapons that scale beyond what humans can fathom.
 
Number 1 is technically true but practically unimportant. May be useful as a...caveat...when pondering our supposed limitations.

I think number 2 is the closest and is evidenced by our adaptation to the Internet, which is still such a violent force that we have great difficulty evaluating its effects, but which we all embrace.

I'm not at all convinced by number 3 (it's religious superstition with a power cord), but I do think there's a possibility that a technological metaphysics and subsequent religion could develop. Already I'm more interested in why/how people believe we're living in a simulation than I am in the childish question itself, for example. Probably will be no big deal, but it could be the religion that ends us - stay tuned lol.

Number 4 is the most clearly ignorant and cynical view imo.
 
I don't think humans can adapt to technology, it can only shape us.

Look at our physical degeneration as we become more sedentary. Under normal circumstances, those who lack adaptational fitness would die out. Now, with the benefit of technology, even the most unfit specimens of humanity not only survive but procreate and die old. That isn't due to some sudden realization that increased our fitness, but to technological advances that allowed us to mitigate that lack of fitness.

The story of human progression is the story of our development, advancement, and integration of technological advances into our society. The body itself is the final frontier of that progression. Artificial Intelligence is not the quest to create intelligence on par with humans, but to replicate the mechanism of human intelligence at scale. That is to say, to surpass human intelligence using ourselves as a blueprint, not the intended product.

When people think of "human nature" nowadays I think a common tendency is to anchor it in biology.

I thought about evolution when I was writing the OP as well. But if technology is disrupting or opposing human nature, I don't think it's doing so by altering fitness or selection criteria - the reason being that those were always a product of environment, and what we live in now is an environment no more or less than it ever was.

The best you could do in my opinion is to say that technology has diversified or made unpredictable the relationship between our intuitive criteria for "success" and reproductive fitness. But that doesn't strike me as especially alarming, perhaps because a tight connection between the two doesn't seem like a necessary feature of "human nature" to me.

If I did believe in a human nature, I would consider AI a strong force pushing us toward 3(b). Human nature could be a precursor to a more powerful and enduring nature that is yet to come, though what will happen to us then (and how we'll feel about it) have yet to be determined.

In that sense I would also agree with this, especially the ego remark:

I believe our adaptability has limits, and we may have reached them. I also have zero doubt that if we invent a way to merge man and machine, humans will eventually cease to exist, due to our own egos thinking that copying our brains into electronic format means we will remain who we are.
 
Neither.

Humans are not very adaptable, true, but they are highly manipulable.

So in the end it's not what the bulk of mankind wants; it's what the bulk of mankind is told to want.
 
I don't like the inevitability in 2 and 3, and thus I don't pick any. Don't love the phrasing either. I'd say that human nature is pretty static if you look at the deep structure of it, but it can manifest in very different ways depending on incentives in a society. But the incentive structure is also driven by people so I think there's a continuing process of tension and then bending (or not--it's certainly possible for societies to basically collapse, and we've seen it many times).

With regard to that structure, I pretty much follow Adam Smith (particularly The Theory of Moral Sentiments). This bit is particularly relevant:

It's funny you mention this, because another piece of context for why this question arose is a book about the Enlightenment that I'm about a third of the way through. The chapter I just finished discussed how Enlightenment thinkers needed to navigate between two unpalatable options so that they'd be able to pull all people together into a single human society: the recognition of divinity or conformity to "natural law" in people on the one side, and mere selfishness (as in Hobbes) on the other. "Sentiment" was a tool that they used to find a middle ground - the Scottish thinkers especially (that being said - Smith still leant on divinity for a few major points).

Personally I think selfishness is probably more empirically justifiable, especially given how we demonstrated such an ability to shut off "sentiment" or "sympathy" toward others at multiple points in the 20th century. Perhaps the potential for those is always there, but the selfish will to survive will exist regardless of social incentives.

Not saying that it's the be-all-end-all for human nature either though.
 
From a purely biological perspective, I would have to say somewhere between 3a and 4, if I correctly interpret 1 through 4 as some sort of continuum.

The reason I feel that way is that, while we have large brains and observational learning, there is always going to be a genetically deterministic factor involved, at least until we get gene editing with CRISPR working. The reason I chose 3a is that there seems to be a repeating theme of a "race to the bottom" with humans. That race-to-the-bottom instinct will lead to irreversible environmental change and weapons that scale beyond what humans can fathom.
These are good thoughts.

If I was going to choose 1, or deny an enduring human nature altogether, it would be based on some combination of what you've mentioned.

Let's say the purpose of our biological inheritance, in the most abstract sense, is to transfer information from one generation of living thing to the next. Now let's say (and this is the controversial premise), that the information transfer we participate in now is primarily cultural as opposed to biological. So much so that receptiveness to cultural information over biological information becomes a selection criterion as generations proceed.

This would be pretty huge in terms of "skyhooking" us out of our biological constraints. Furthermore, if some of the cultural information we transmit involves modifying our biological information, it's hard to see how we would then be beholden to human nature in any way (or we would have to define it as something completely independent from biology).

The destructiveness of "racing to the bottom" sucks even more in light of those possibilities.
 
Number 1 is technically true but practically unimportant. May be useful as a...caveat...

Muah.

I think number 2 is the closest and is evidenced by our adaptation to the Internet, which is still such a violent force that we have great difficulty evaluating its effects, but which we all embrace.

I'm not at all convinced by number 3 (it's religious superstition with a power cord), but I do think there's a possibility that a technological metaphysics and subsequent religion could develop. Already I'm more interested in why/how people believe we're living in a simulation than I am in the childish question itself, for example. Probably will be no big deal, but it could be the religion that ends us - stay tuned lol.

Number 4 is the most clearly ignorant and cynical view imo.
My concern in considering number 4 is that aspects of our theory of "human nature" tend toward excessive universalizing, when what we actually observe is closer to segmentation. As @Jack V Savage mentioned, if variable incentive structures produce different manifestations of "human nature" (is that meaningfully different from there not being one?) and our structures tend in a particular direction (liberal democracy, let's say), then a fraction of the extant population will be fundamentally unsuited to the technocratic society being built.

Arguably that's the fractionation we're seeing all over the place today in our "clash of civilizations". What I don't understand is the alternative vision of the future the "lagging" populace wants to put forward. Ol' Ted wanted a "return to nature" of sorts, but it seems to me that adopting such a philosophy leaves you unprepared to protect that nature against the most devastating possible threats.

@Seaside, that's your cue.
 
Personally I think selfishness is probably more empirically justifiable, especially given how we demonstrated such an ability to shut off "sentiment" or "sympathy" toward others at multiple points in the 20th century. Perhaps the potential for those is always there, but the selfish will to survive will exist regardless of social incentives.

Not saying that it's the be-all-end-all for human nature either though.

Smith gets into that. For example:

This disposition to admire, and almost to worship, the rich and powerful, and to despise or, at least, neglect persons of poor and mean conditions, though necessary both to establish and to maintain the distinction of ranks and the order of society, is, at the same time, the great and most universal cause of the corruption of our moral sentiments.

I don't think it makes any more sense to describe people as generally selfish than it does to describe us as generally tall. If it just means "more selfish than most people think," I would say that is very wrong. People want "stuff" and power in large part because we want to be loved and worthy of being loved (Smith uses "loved" in a particular way), and that's a path to it in our society. In differently structured societies, that same desire (in excess) leads to people giving everything away so that everyone is in their debt. The way we have it set up is better for encouraging economic growth and thus raising living standards, so I wouldn't support changing it totally, but I think it's important to remember that people aren't inherently infinitely acquisitive.

And to illustrate the point about selfishness, think of someone who runs into a burning building to save a kid. Someone who describes that behavior as "selfish," as some would (done only to maximize self-esteem, maybe, or to earn the respect of the community), is just revealing something about his own worldview, not making a valuable, predictive statement about how humans act. The trick for society is to design incentives for behaving in ways that maximize the well-being of its residents. It tends to be an evolutionary process rather than a deliberate one, of course (recognition of that is the basis of intelligent conservatism).
 
When people think of "human nature" nowadays I think a common tendency is to anchor it in biology.

I thought about evolution when I was writing the OP as well. But if technology is disrupting or opposing human nature, I don't think it's doing so by altering fitness or selection criteria - the reason being that those were always a product of environment, and what we live in now is an environment no more or less than it ever was.

The best you could do in my opinion is to say that technology has diversified or made unpredictable the relationship between our intuitive criteria for "success" and reproductive fitness. But that doesn't strike me as especially alarming, perhaps because a tight connection between the two doesn't seem like a necessary feature of "human nature" to me.

If I did believe in a human nature, I would consider AI a strong force pushing us toward 3(b). Human nature could be a precursor to a more powerful and enduring nature that is yet to come, though what will happen to us then (and how we'll feel about it) have yet to be determined.

The reason I chose to focus on biology is that it's one of the few things we can point to as evidence of incontrovertible changes. The environment can only go so far toward explaining the development of human society; it's the biological shifts that occur as a result of those developments that affect our society and its primary drivers. For instance, on the topic of fitness, consider agriculture as it developed in the Fertile Crescent. The prevailing theory is that we changed gears as a result of population increases (particularly in density). Where we previously would have corrected back to equilibrium as less fit individuals died out, technology that increased food yield not only increased the overall fitness of tribes but left time for the advancement of pursuits that had nothing to do with basic subsistence. People lived longer, more knowledge was transferred, and it served as the foundation for civilization as we know it.

Were it not for that external biological pressure (hunger), we would have had no incentive to press forward with the development of stationary population centers. It is only through the advent of farming techniques that we enabled ourselves to survive long term in defiance of typical notions of fitness. Technology allows us to surpass biological limitations; that's the role it plays in the development of our society. The further ahead of biology we get, the more we expose the limitations of human nature, physical or otherwise. Biology is the marker that determines the development of our society, and bending it to our will is an essential piece of the process if sustainability is on the agenda.

From that perspective, technology is a corrective mechanism used to temper the drawbacks foisted on us by biology. AI will do the same, only for the abstract thought previously restricted to that arcane realm known as the human brain.
 
These are good thoughts.

If I was going to choose 1, or deny an enduring human nature altogether, it would be based on some combination of what you've mentioned.

Let's say the purpose of our biological inheritance, in the most abstract sense, is to transfer information from one generation of living thing to the next. Now let's say (and this is the controversial premise), that the information transfer we participate in now is primarily cultural as opposed to biological. So much so that receptiveness to cultural information over biological information becomes a selection criterion as generations proceed.

This would be pretty huge in terms of "skyhooking" us out of our biological constraints. Furthermore, if some of the cultural information we transmit involves modifying our biological information, it's hard to see how we would then be beholden to human nature in any way (or we would have to define it as something completely independent from biology).

The destructiveness of "racing to the bottom" sucks even more in light of those possibilities.

I think it's a difficult premise to accept: that the function of biological inheritance is the transfer of information. We can use a thought experiment here. Imagine that somehow an entire next generation is born and, during their infancy, everyone from the previous generations dies. In this instance the youths somehow mature into adulthood, but there is no transfer of cultural information; all of their behavior would be organic. Obviously this isn't really reflective of what happens, but it presents a situation in which that supposition does not hold.

As far as the idea that receptiveness to cultural info is a selected trait, I think you end up with a chicken-and-egg problem. Even before agriculture, even before man spread outside of his homeland, intergroup behavior was probably a major sexual selection trait. Furthermore, I think biological information (assertiveness, social intelligence, empathy, etc.) sort of "pre-builds" the culture itself, so it is very difficult to unpack them from each other.

The observation that our brainpower lets us overcome our biological constraints, even with everything I have spat out so far, is certainly something we observe, whatever the root cause may be. But simultaneously, as I outlined, maybe it IS tied to our innate nature to build these things.

I don't know if I am wording this correctly, but basically I don't think we can very easily separate nature vs nurture, culture as shaping biology or vice versa, or how much of what we do is really "natural".
 
Muah.


My concern in considering number 4 is that aspects of our theory of "human nature" tend toward excessive universalizing, when what we actually observe is closer to segmentation. As @Jack V Savage mentioned, if variable incentive structures produce different manifestations of "human nature" (is that meaningfully different from there not being one?) and our structures tend in a particular direction (liberal democracy, let's say), then a fraction of the extant population will be fundamentally unsuited to the technocratic society being built.

Arguably that's the fractionation we're seeing all over the place today in our "clash of civilizations". What I don't understand is the alternative vision of the future the "lagging" populace wants to put forward. Ol' Ted wanted a "return to nature" of sorts, but it seems to me that adopting such a philosophy leaves you unprepared to protect that nature against the most devastating possible threats.

@Seaside, that's your cue.
I'm having trouble separating human behavior ("human nature" is just human behavior) from social and technological progress, which is itself a function of human behavior. I think it's therefore true by definition that any threat is within the realm of human behavior, and that human nature is adaptable to exactly the degree to which we progress, as has always been the case. So I guess it boils down to fear of the past, the present, and the inevitable. On principle, then, I have to say it's not a valid concern, except that we know there will be problems and conflicts. But not special problems that are going to break everything just because we're making progress.

But this conflicts with my observation that the Internet has been a wildly strong and unpredictable force in terms of changing "human nature," and that we don't know most of its effects yet. We should keep an eye on things like crime, income, etc. just like we would for evaluating any other point of progress, and so far there's no reason to think humanity will be doing worse because of the recent tech changes.

Basically, history and reasoning refute #4, so it would have to be a new technological threat to the social order that works fundamentally differently from older threats, and I see no evidence of that.
 
Human nature is to be free, to follow one's heart, and to openly share what one finds. That doesn't happen when, from the moment you are born, you are forced to conform to a fraudulent, artificial identity given to you. Our society claims to love diversity, but it's literally annihilating it to create a single standard of human expression. And we grow weaker, more dependent on the shallow external, and more artificial with each passing day.
 
Human nature is to be free, to follow one's heart, and to openly share what one finds. That doesn't happen when, from the moment you are born, you are forced to conform to a fraudulent, artificial identity given to you. Our society claims to love diversity, but it's literally annihilating it to create a single standard of human expression. And we grow weaker, more dependent on the shallow external, and more artificial with each passing day.
But I bet you won't give up roads and running water and the Internet.
 
