Is a Neuropath future inevitable and/or unavoidable?


BeardFisher-King

« on: April 27, 2017, 08:24:08 pm »
I find what I could stand to read of Neuropath and Crash Space extremely disturbing and unnerving. Don't like reading them. At all.
I think that's his point - we don't want to face our future, but here it is!
I get that, but I disagree that our future is inevitably headed in that direction. :)
"The heart of any other, because it has a will, would remain forever mysterious."

-from "Snow Falling On Cedars", by David Guterson

Wilshire

« Reply #1 on: April 28, 2017, 12:34:55 am »
Define inevitable.

Do you mean that, no matter what we try to do, individually or collectively, Neuropath and Crash Space will happen?
I don't think so.

But if you mean simply that, should science and technology and current trends continue unchecked and unhindered, those are potential realities? I absolutely think so.
One of the other conditions of possibility.

Francis Buck

« Reply #2 on: May 16, 2017, 06:27:21 am »
Quote from: Wilshire
Define inevitable.

Do you mean that, no matter what we try to do, individually or collectively, Neuropath and Crash Space will happen?
I don't think so.

But if you mean simply that, should science and technology and current trends continue unchecked and unhindered, those are potential realities? I absolutely think so.

Agreed, particularly the last bit.

At this stage, I personally view most of Bakker's fiction as warnings of the negative (usually disastrous) consequences of the Information Age/Anthropocene or Transhumanist/Dataist movement - whatever we call "this era happening right now" - rather than declarations of the inevitable. In other words, he writes cautionary tales regarding these topics and issues.

Quoting directly from Wikipedia because I'm lazy:

Quote
Like horror fiction, generally the cautionary tale exhibits an ambivalent attitude towards social taboos. The narrator of a cautionary tale is momentarily excused from the ordinary demands of etiquette that discourages the use of gruesome or disgusting imagery because the tale serves to reinforce some other social taboo.

Cautionary tales are also frequently utilised to spread awareness of moral issues, and for this reason are often told to children to make them conform to rules that either protect them or are for their own safety.

Obviously Bakker is writing such tales for adults more than children, and the theme of "unknown unknowns" that runs through his work makes such categorization a bit more nuanced. But, I think one can approach his stories the way one might approach, say, 1984 by George Orwell. It's meant to be taken quite seriously, but not to the degree that we literally prepare for an actual political-entity-organization-thing called Big Brother, just something (or things) similar to it.

Thus, we today should be prepared, to the extent of our ability, for people/beings/entities like a Neuropath, or a Kellhus, and be aware of the potential disasters from technologies like those seen in Crash Space.

As for my own personal view on where reality will take us, for whatever that's worth, I generally fall somewhere in the middle. I don't really think our entire society as we know it will rapidly collapse in an apocalyptic fashion within the next, whatever, 20 years (I admit it feels weird saying this in a post-Trump Administration America, but my larger-scale view of things has not really changed that dramatically). Instead I lean more toward it being a perceptually gradual shift for most of the world, albeit one that is lightning fast by historical standards.

That being said, if I were to bet a million dollars, I'd suggest that by the end of the 21st century a significant portion of the world's population will look back at humanity as we are now and realize they are no longer the same animal -- and unfortunately I don't see that process being a perfectly smooth progression for people in general. Persecution and bigotry are not so easily solved, especially the latter. It seems very likely that under such circumstances, a considerable chunk of the population will be "left behind", for lack of a less loaded phrase, and I would imagine that group will include those both willingly and unwillingly, depending on the context. And that may well be for the better. Who can say?

That is, after all, what makes this point in history uniquely challenging. No one knows how information technologies will alter our society even five years from now. We can make educated guesses, and probably get a decent idea, at least for now. But make no mistake: once/if we find ourselves co-existing with entities possessing an intelligence even moderately superior to our own, all bets are off. It is, quite literally, impossible to comprehend the nature of such things.

*I only omit TSA from the rest of RSB's oeuvre because it's just so damn huge and covers so many topics and issues, many of which are older than God. Which is not to discredit the more speculative elements; I actually think one of the series' bigger under-sung achievements is its virtually unparalleled depiction of what a greater-than-human intellect such as a Dunyain might actually be like.

The series basically ruined superhuman intelligences in other fiction for me, at least any that try to "get inside the head" of such a thing. 

ETA: Actually, I'll amend that last statement -- Peter Watts deserves a mention for his portrayal of creatures smarter (by a large or small margin) than humans. His approach is different and he certainly hasn't been spending 30 years writing a character like Kellhus, but he's definitely circling a similar array of ideas, and he does it with more finesse than basically any other non-RSB author I've personally read.



TaoHorror

« Reply #3 on: May 19, 2017, 06:16:12 pm »
My thoughts:

Books like 1984 and Herbert's stuff are not meant to be future-telling/warnings/clairvoyance, but preventative medicine for humanity. Because 1984 has entered our general awareness, it likely won't happen now. Herbert "warns" of the dangers of artificial intelligence, so now that we've read his stuff, the effort of creating free-willed artificial consciousness without a plug will be approached with greater caution ( if such a thing is even possible – note Void Ship "stumbles" onto creating such a thing, nodding to evolution's greatest gifts: circumstance, error, and the "motivation" of the living to thwart terrifying/threatening circumstances – remember, it's the same people trying to come up with AI in each iteration, but one group finally succeeds despite operating in the same circumstances as the previous attempts ). That, and the potential callousness of "using" clones as tools and not treating them as "real" people ( another attempt to ward against a potential future evil ).

Not sure Neuropath fits the bill as either foretelling a frightening future or as deterrence – more like how maddening it could be for us to discover "there's nothing at the bottom of the bag". I take it as a story about how that revelation could impact the scientists who initially discover the truth of it ( Neuropath suggests they could go mad ). Could be a warning, though – better keep a close eye on those pursuing this study, perhaps?

In reference to Trump … here’s holding up a glass to America’s checks and balances government caging that crazy rooster haired fucker … ( read: hoping! ).

Your "middle" prediction is interesting – I take it that's how you're explaining your view that the future will likely not be sewn to any one person's vision? If so, then I'd agree. This is one of the best contributions of SciFi – the ability to measure our imaginations of the future. We still haven't made it past the moon ( as 2001 would have us believe ), but our advancements in digital technology have surpassed all visions ( compare Alien with Prometheus ), and simply networking the planet has yielded a surprising present day ( even Dan Simmons had to "update" how the AI in his Hyperion story was created, a nice break for him since he hadn't gone into it yet in the first book ).

I guess you're right, there's no way of knowing how we'll "react" to a "higher" intelligence, but more so due to the improbability that any are out there ( or at least within walking distance in the cosmos ). But if it did happen, think of the "superior" alien intellect visiting Earth – the movie Arrival did a decent job of addressing the complexity of such an interaction, let alone the alien's ability to manipulate us.

Regarding your "superior intellect" point, a message of the book is to be wary of those you perceive as being more intelligent than yourself … it could be that you're just looking at them from the wrong angle, or mistaking the "foreign" for the superior; a lack of understanding or a difference in capabilities does not denote better or worse ( someone "smarter" than you may not have judgement as good as yours, or perform as well under stress ). So consider not letting the works you mention "ruin" stories about "superhuman intellect" … it's still something new waiting to be discovered.
It's me, Dave, open up, I've got the stuff

jamesA01

« Reply #4 on: July 13, 2017, 12:28:25 am »
If it's not then we're all failures.

Wilshire

« Reply #5 on: July 13, 2017, 11:30:38 am »
Quote from: jamesA01
If it's not then we're all failures.
Welcome back :) .

jamesA01

« Reply #6 on: July 27, 2017, 11:59:04 pm »
Good to be back; this place looks a lot better now.

The tech in Neuropath/Crash Space is exactly the kind of tech we need to STOP johnny serial killer and muhammed el boom boom before they go loco. Sorry if it doesn't respect hallucinatory religious notions of a transcendental self and its holy agency. The Buddha is laughing at you while he reads a thinkpiece on how diets rich in rice seem to produce less individualistic cultures. It's all material, folks.

We always were stupid, violent, gene-replicating trash, and we've always been doing the kind of evil shit depicted in those novels to each other and covering it over and getting away with it. Being able to lock someone up after the fact may stop them doing it again, but it's hardly good enough.

Yeah, I know this tech is neutral, it's gonna be used for good and bad. No, I am not implying that it will only be used for good. But the faster we get there the better IMO.

It all depends on how good you think current humanity is. I think we're pieces of shit (I, we, you) and our history bears that out, and even the best of us are still pretty horrible.

Wilshire

« Reply #7 on: July 28, 2017, 12:45:47 pm »
Hope you stick around for a while. And thanks, it's taken some work but I'm glad some people like the new look.

I'm with you.

I recently read something, Madness probably gave me the link, about how post-humanism will likely not start on Earth. The reasoning was basically that there's too much cultural inertia and too little incentive: too much excess for those who could develop and implement it to drive them to do it, then throw in corrupt religion and politics and you're sunk.

No, the post-humanist movement will occur in frontier societies whose need for enhancements will take precedence over millennia of historical baggage. (pie-in-the-sky: ideally that'd be Mars/moon/Lagrange colonies, far enough away and far enough along to avoid a catastrophic failure of Earthicans)

I largely agree with this sentiment. Then, fast forward a few decades/centuries and our post human brethren will come back to earth and wonder how such simple animals could still be in charge.

I just have so little faith in 'humanity' as a whole. Better to chop it up into isolated segments and let the cream rise, though I don't think that's truly possible in the society we live in today.


Though I'd like to contradict myself as well. I think if some progress is made in the avenues of Crash Space / Neuropath, it'll be controlled here by small groups of powerful people. The rich and well connected circumventing laws and genetically enhancing their children directly. Why bother paying for something as ethereal as 'better schools' when you can concentrate your assets to directly benefit yourself and your offspring? This gap, imo, would widen rapidly after it began, and create a kind of permanent divide between top and bottom.

After all, people are terrible creatures. That we've made it this far on our own is a miracle; I just hope we don't hit the self-destruct button before securing ourselves or our progeny a place to live safely once that happens. (I like Kaku's idea that there are no visible Type I alien civilizations because they all blow themselves up before they get there.)




H

« Reply #8 on: July 28, 2017, 01:04:37 pm »
Quote from: Wilshire
Though I'd like to contradict myself as well. I think if some progress is made in the avenues of Crash Space / Neuropath, it'll be controlled here by small groups of powerful people. The rich and well connected circumventing laws and genetically enhancing their children directly. Why bother paying for something as ethereal as 'better schools' when you can concentrate your assets to directly benefit yourself and your offspring? This gap, imo, would widen rapidly after it began, and create a kind of permanent divide between top and bottom.

An EPS cycle, perhaps? I mean, probably not exactly. In fact, perhaps more of an E-S-P cycle, where once the elite have proven it effective, specialized private sectors will get involved (looking first, of course, at military uses), then eventually it trickles down to popular use.
I am a warrior of ages, Anasurimbor. . . ages. I have dipped my nimil in a thousand hearts. I have ridden both against and for the No-God in the great wars that authored this wilderness. I have scaled the ramparts of great Golgotterath, watched the hearts of High Kings break for fury. -Cet'ingira

Wilshire

« Reply #9 on: July 28, 2017, 03:54:56 pm »
Unless it goes full Nonman/Emwama dichotomy, that's pretty reasonable. I'm just of the mind that those who start first will have an insurmountable lead.

Like if you discover an AI/machine learning algorithm to beat the stock market. Yeah, other people will get it at some point, but the longer it's just you, the more assets you'll have accumulated, and those compound to the point where you're functionally infinitely ahead... clunky analogy. Point being, the head start will rapidly become insurmountable.
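The compounding point can be sketched with toy numbers (the 15% vs 7% annual returns below are made-up, purely for illustration):

```python
# Two investors start with identical capital; one has a persistent
# algorithmic edge and so compounds faster. The ratio between them
# grows exponentially with time -- the head start never shrinks.
def compound(principal, rate, years):
    """Value of `principal` after `years` of annual compounding at `rate`."""
    return principal * (1 + rate) ** years

edge, baseline = 0.15, 0.07  # hypothetical annual returns

for years in (5, 10, 20, 40):
    ratio = compound(1.0, edge, years) / compound(1.0, baseline, years)
    print(f"after {years:2d} years, the edge-holder has {ratio:.1f}x the capital")
```

Even a modest edge, held exclusively for a few decades, leaves everyone else functionally unable to catch up.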

H

« Reply #10 on: July 28, 2017, 04:03:58 pm »
Quote from: Wilshire
Unless it goes full Nonman/Emwama dichotomy, that's pretty reasonable. I'm just of the mind that those who start first will have an insurmountable lead.

Like if you discover an AI/machine learning algorithm to beat the stock market. Yeah, other people will get it at some point, but the longer it's just you, the more assets you'll have accumulated, and those compound to the point where you're functionally infinitely ahead... clunky analogy. Point being, the head start will rapidly become insurmountable.

Yeah, not questioning that.  Not to mention that what would end up becoming "popular" would probably be the 'tail-end' stuff.

Kind of how everyone has a computer now, but not everyone has a Cray.

Madness

« Reply #11 on: July 28, 2017, 06:50:21 pm »
Good to see you back around, jamesA01!
The Existential Scream
Weaponizing the Warrior Pose - Declare War Inwardly
carnificibus: multus sanguis fluit
Die Better
The Theory-Killer

solipsisticurge

« Reply #12 on: August 04, 2017, 08:03:23 am »
Short-term profitability will decide this over any cultural resistance to the ideas. Just as we stand on the precipice of the death of meaning and purpose, so we witness, in our meager ways, the final days of human relevance to the only system that still matters (perpetual economic growth). Once algorithms and machines cross the point of no return in terms of outperforming humanity at sufficient tasks, our humanist ideologies will crash beneath their monetary weight. AI with their core code set to the accumulation of wealth and dominance of their market will make the vast majority of purchases (from a GDP perspective if not by number of individual transactions), largely removing even the need for human customers.

(Side note: that would make a nice story idea, a futuristic Earth where a handful of extremely powerful economic algorithms/AI viciously attempt to outcompete each other, a never-ending game of ruthless seven-dimensional economic chess, with products produced at ever-increasing scale, bartered back and forth, with no humans left who actually need or desire them. One AI corners the market on disposing of all the useless junk, gradually achieves total monopoly over all industries, wins the war. Begins attempting to eliminate itself as a competitor.)

The free world has shown repeatedly that economic reward will trump humanist notions most of the time. This is aided by the difference in reaction speed: governmental and social institutions are much slower to reach consensus and make decisions than purely economically driven ones. Thus will all humanist institutions and reservations eventually fail, if only because all those holding them are far easier to starve out than those playing along.
Kings never lie. They demand the world be mistaken.

Wilshire

« Reply #13 on: August 15, 2017, 01:52:13 pm »
solipsisticurge, I love the discussion about economics, and I pretty much agree. Money is the only thing that matters to society, or at the very least it moves so fast that it renders everything else irrelevant. Churches don't ask for donations and hide their accumulated assets from view for no reason. I wonder if they've been commissioning quantum computers and tech employees to build them strong AI ;) .

Money properly invested makes money faster than anything else. The vast majority of transactions on the NYSE are executed by computers running algorithms. The companies that own them literally fight to be physically closer to the NYSE servers, because the speed of light over fiber optic cable limits how fast their computers can execute orders. Futures/derivatives markets that trade imaginary monies using leveraged accounts bought the NYSE ( look up ICE ), as their accumulated assets are worth hundreds of times more than any real-world equivalent. Wealth, money, is not tied to anything physical and hasn't been for decades. Markets are built to extract money from systems for the purpose of making more, faster, without the need for messy humans.
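The colocation point can be sanity-checked with rough arithmetic (the ~0.67 figure for light's speed in fiber is a ballpark approximation, not a measured value for any particular exchange link):

```python
# Round-trip signal latency over optical fiber: at trading timescales,
# every kilometre of distance from the exchange costs ~10 microseconds.
C = 299_792_458        # speed of light in vacuum, m/s
FIBER_FACTOR = 0.67    # light in fiber travels at roughly 67% of c

def round_trip_us(distance_km):
    """Round-trip latency in microseconds for a one-way distance in km."""
    return 2 * (distance_km * 1_000) / (C * FIBER_FACTOR) * 1e6

for km in (1, 50, 1_000):
    print(f"{km:5d} km from the exchange: ~{round_trip_us(km):,.0f} µs round trip")
```

A firm 1,000 km away eats roughly 10 ms per round trip; a colocated one pays single-digit microseconds, which is why the rack space next to the matching engines is fought over.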

So, by and large, the world you've described already exists today. It'll be a matter, probably sooner rather than later, of humans trying to keep up with the runaway wealth of robo-investors trading in derivatives markets worth thousands of times more than any real assets.

I love your scifi idea. I'd buy that book :D .
One of the other conditions of possibility.