  1. #106
    Quote Originally Posted by ahigherway View Post
    I disagree. I don't think war helps us. But a crisis can certainly help. If man were stripped of his warlike tendencies, he'd be forced to look for other solutions to problems.

    There's lots of talk today about going to war with Iran. But if enough people speak up against the war-drums, diplomacy could win out.

    2001 was kind of on the theme of the "continuation" of war-like tendencies within man. Personally, I don't think it's man's destiny to be warlike. On the contrary, I think man's war-tendency is a type of "short-circuit" response.


    brian
    While I agree that we ought to abandon war, at least amongst ourselves, it's naive to think that the entire species will comply.

    And while I do agree that war should be avoided simply from a moral point of view, I think it would be naive to discount how wars have allowed us to make technological leaps and bounds. The factories that churned out thousands of tanks in a short period of time gave us unprecedented insight into industrialization and production. The massive budgets for flight and rocketry during the Second World War and the Cold War produced advances in such a short period that it would be naive to think they would have happened anyway without those wars. The internet we are using right now was a result of military research.

    Still, the millions dead, and the fact that we can now set our civilization back hundreds or thousands of years within hours, lead me to believe that it may not be worth it. But it's naive to think that we got nothing out of war.

    Ultimately it comes down to what we want versus what the evidence suggests we will get. What we want is a civilization that doesn't rely on war and moves on to more efficient ways of advancing and getting the resources we need. What the evidence suggests we will get is probably what we have always had: periods of peace followed by periods of conflict, the only difference being that we will have better toys to kill each other with.

    And while, yes, diplomacy and public pressure will avert some wars, many others will be waged regardless of what the civilian population wants. There will be coups, there will be madmen, and there will be bloated military budgets, just as there always have been.


    Lastly, there is always the scenario of an aggressive, militaristic alien species that wants to dominate us. I don't think our no-war stance would hold if, say, we had stopped or completely minimized military research for hundreds of years before such aliens showed up. Better to be peaceful while holding all the big sticks.

  2. #107
    Registered User mylinar's Avatar
    Join Date
    Sep 2007
    Location
    Pittsburgh, PA USA
    Posts
    399
    Quote Originally Posted by psikeyhackr View Post
    If the people who want to start a war can't get enough peons to go along with the idiocy, then how do you have a war?

    psik
    Drones?

    After all, we created the Atari generation to take care of all this for us. Think about it. Right now we have the lowest soldier-to-civilian (as in never been in the military) ratio of any time in US history.

    I almost said 'our', forgetting that this is an international forum. My apologies for being too country-centric.
    Last edited by mylinar; September 6th, 2012 at 07:44 AM. Reason: missed a couple of words, need more coffee.

  3. #108
    I like SF. SF is cool. Steven L Jordan's Avatar
    Join Date
    Nov 2010
    Location
    Germantown, Md.
    Posts
    451
    Somehow, I don't think discussion of future war fits into the "What I think is lacking in most sci-fi stories" subject of this thread.

  4. #109
    Live Long & Suffer psikeyhackr's Avatar
    Join Date
    Feb 2008
    Location
    Sol III
    Posts
    2,735
    Quote Originally Posted by Steven L Jordan View Post
    Somehow, I don't think discussion of future war fits into the "What I think is lacking in most sci-fi stories" subject of this thread.
    Quote Originally Posted by mylinar View Post
    After all, we created the Atari generation to take care of all this for us. Think about it.
    I was talking about what kind of people would be allowed in space, and that "drones" could not survive there and would probably get other people killed.

    But that is the problem with much SF. The writer cannot create very intelligent characters. Because the writer controls the background and knows what will happen, the character can be made to seem smart in action, but the thought processes are just not very interesting. Some writers can do it, but most cannot. None of the Star Wars characters are really intellectually interesting.

    psik

  5. #110
    Couch Commander Danogzilla's Avatar
    Join Date
    Jan 2012
    Location
    New England
    Posts
    617
    Do any of you put any credence in the idea Vernor Vinge (among others) has been promulgating for decades now, that 1) there is a technological singularity coming, and 2) we can't possibly predict life beyond that point?

  6. #111
    Live Long & Suffer psikeyhackr's Avatar
    Join Date
    Feb 2008
    Location
    Sol III
    Posts
    2,735
    Quote Originally Posted by Danogzilla View Post
    Do any of you put any credence in the idea Vernor Vinge (among others) has been promulgating for decades now, that 1) there is a technological singularity coming, and 2) we can't possibly predict life beyond that point?
    I don't buy the AI/transhuman junk. So we are getting faster, cheaper von Neumann machines. They are symbol manipulating devices. They do not understand the information.

    This relates to General Semantics, which some sci-fi writers like Heinlein and A. E. van Vogt were interested in back in the 50s.

    I do think we have the potential for a social singularity via education. I have a Google Nexus 7. Soon we will have dual-core Android tablets from China for $100. The trouble with the Nexus 7 is that it has no microSD slot. Google decided it should be a media consumption device. But it has provided a standard for other tablet makers. So 7-inch Androids may take over education.

    What will we do with them? Or what will kids do with them?

    http://users.aber.ac.uk/dgc/funtheyhad.html

    Only 140 years early.

    psik

  7. #112
    Quote Originally Posted by psikeyhackr View Post
    I don't buy the AI/transhuman junk. So we are getting faster, cheaper von Neumann machines. They are symbol manipulating devices. They do not understand the information.

    This relates to General Semantics, which some sci-fi writers like Heinlein and A. E. van Vogt were interested in back in the 50s.

    psik
    You underestimate the power of AI. There is a way they will be able to understand the information... research is going on as we speak...

  8. #113
    Registered User mylinar's Avatar
    Join Date
    Sep 2007
    Location
    Pittsburgh, PA USA
    Posts
    399
    Quote Originally Posted by Rosie Oliver View Post
    You underestimate the power of AI. There is a way they will be able to understand the information... research is going on as we speak...
    I read this the other day. Interesting, but still a long way from practical.

    http://www.codeproject.com/News.aspx...669&_z=5132512

    By the way, mark me as a skeptic, not a nay-sayer. I just think that we have a tendency to overestimate how fast things will develop. As for the Singularity, well, Mr. Kurzweil is a noted futurist and genius, but his predictions are a very mixed bag. I am placing the Singularity in the unlikely category, at least for the next generation or two.

    I hope I am wrong, but even if I am, it is unlikely to happen while I am still here. Perhaps in my daughter's time, though.
    Last edited by mylinar; September 6th, 2012 at 03:02 PM. Reason: hit send too quickly

  9. #114
    Live Long & Suffer psikeyhackr's Avatar
    Join Date
    Feb 2008
    Location
    Sol III
    Posts
    2,735
    Quote Originally Posted by Rosie Oliver View Post
    You underestimate the power of AI. There is a way they will be able to understand the information... research is going on as we speak...
    We have been hearing that since the 60s.

    But it is curious that we almost never hear a good explanation of how a von Neumann machine works. Today's von Neumann machines just do the same things faster and cheaper. But now it is cheap enough to track your cookies and stick the proper advertising in your face.

    That is what is impacting us more than AI.
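    For the record, the whole trick is a fetch-decode-execute loop over one shared memory. Here is a toy sketch in Python (a made-up three-instruction machine, nothing like a real ISA, purely to show it only shuffles symbols):

    Code:
    # Toy von Neumann machine: program and data share one memory,
    # and the CPU just fetches, decodes, and executes symbols in a loop.
    # (Hypothetical three-instruction set, for illustration only.)
    def run(memory):
        pc = 0                             # program counter
        while True:
            op, a, b = memory[pc]          # fetch + decode
            if op == "ADD":                # execute: mem[a] += mem[b]
                memory[a] = memory[a] + memory[b]
            elif op == "JUMP_IF_ZERO":     # conditional branch
                if memory[a] == 0:
                    pc = b
                    continue
            elif op == "HALT":
                return memory
            pc += 1                        # on to the next instruction

    program = [
        ("ADD", 4, 5),      # mem[4] += mem[5]
        ("HALT", 0, 0),
        None, None,
        2, 3,               # data lives in the same memory as the code
    ]
    print(run(program)[4])  # -> 5, with no notion of what "5" means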

    The Space Merchants is more prophetic than Neuromancer.

    psik

  10. #115
    Quote Originally Posted by psikeyhackr View Post
    We have been hearing that since the 60s.

    But it is curious that we almost never hear a good explanation of how a von Neumann machine works. Today's von Neumann machines just do the same things faster and cheaper. But now it is cheap enough to track your cookies and stick the proper advertising in your face.

    That is what is impacting us more than AI.

    The Space Merchants is more prophetic than Neuromancer.

    psik
    Predicting exactly how the future will turn out technology-wise is very difficult, because of the need to take the details into account. After all, the Romans could have had steam engines if they had put their minds to it. All they needed was to use one bit of technology to improve Hero of Alexandria's steam engine and it would have been a done deal... and yes, they were already using that technology elsewhere.

    However, general trends are predictable. The need for quicker responses from computers, because we are asking more of them, is inevitable... which means neural networks will be improved or superseded. The evolutionary process is already being mimicked by evolutionary algorithms. So here we already have two pieces of the puzzle... and just as aircraft were invented by putting the right technologies together in the right framework, someday someone will put together the computer algorithms to get a sentient AI. It's a case of when, not if...
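    To give a feel for what I mean by that second piece: a bare-bones evolutionary algorithm is just a mutate-score-select loop. A rough Python sketch (the target string, population size and mutation rate here are arbitrary choices of mine, not any real research system):

    Code:
    import random

    # Minimal evolutionary algorithm: mutate, score, select, repeat.
    # Evolves random strings toward a target by variation plus selection.
    TARGET = "sentience is overrated"
    ALPHABET = "abcdefghijklmnopqrstuvwxyz "

    def fitness(candidate):
        # Count the characters that already match the target.
        return sum(c == t for c, t in zip(candidate, TARGET))

    def mutate(candidate, rate=0.05):
        # Each character has a small chance of being replaced at random.
        return "".join(random.choice(ALPHABET) if random.random() < rate else c
                       for c in candidate)

    population = ["".join(random.choice(ALPHABET) for _ in TARGET)
                  for _ in range(200)]
    generation = 0
    while max(map(fitness, population)) < len(TARGET):
        # Selection: keep the best half, refill by mutating the survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[:100]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(100)]
        generation += 1

    print("reached the target in", generation, "generations")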

  11. #116
    I like SF. SF is cool. Steven L Jordan's Avatar
    Join Date
    Nov 2010
    Location
    Germantown, Md.
    Posts
    451
    Quote Originally Posted by Danogzilla View Post
    Do any of you put any credence in the idea Vernor Vinge (among others) has been promulgating for decades now, that 1) there is a technological singularity coming, and 2) we can't possibly predict life beyond that point?
    In a word: No.

    In a few more words: I think Vinge and many others severely overestimate not the capability of human technology to make such an incredible transformation of our world, but the likelihood that it will. They assume tech will continue to develop in all directions, unabated, and that this will bring about the singularity. I say that opinion is unrealistic.

    My impression of, and experience with, how humans deal with technology tell me that tech will continue to develop in narrow pathways, directed largely by immediate need and metered by an untrusting populace, giving us advances in some areas and stagnation in others, and a result that will be nowhere near a "singularity."

  12. #117
    Quote Originally Posted by Rosie Oliver View Post

    However, general trends are predictable. The need for quicker responses from computers, because we are asking more of them, is inevitable... which means neural networks will be improved or superseded. The evolutionary process is already being mimicked by evolutionary algorithms. So here we already have two pieces of the puzzle... and just as aircraft were invented by putting the right technologies together in the right framework, someday someone will put together the computer algorithms to get a sentient AI. It's a case of when, not if...
    But what would the purpose of a sentient AI be? If we build computers and machines to do a specific task, why give them the ability to say no?

  13. #118
    Quote Originally Posted by warrior6 View Post
    But what would the purpose of a sentient AI be? If we build computers and machines to do a specific task, why give them the ability to say no?
    The main purpose of sentience in human beings has to be survival... it could be the same for a sentient AI, but it could have a completely different purpose embedded in it - it depends on what it was set up to do and what pushed it over the edge to become sentient... This difference between AI sentience and survival-driven sentience has never really been explored in science fiction. But to be fair, it isn't the easiest of themes to write about...

  14. #119
    I like SF. SF is cool. Steven L Jordan's Avatar
    Join Date
    Nov 2010
    Location
    Germantown, Md.
    Posts
    451
    Sentience doesn't have to be based on individual survival. Survival of a group (including a sense of self-sacrifice) can override individual survival--just as it sometimes does with humans and other animals. Or the need to accomplish a task can override survival.

    Examine the actions of HAL 9000 in 2010 and you'll see an example of that.

  15. #120
    Quote Originally Posted by Rosie Oliver View Post
    The main purpose of sentience in human beings has to be survival... it could be the same for a sentient AI, but it could have a completely different purpose embedded in it - it depends on what it was set up to do and what pushed it over the edge to become sentient... This difference between AI sentience and survival-driven sentience has never really been explored in science fiction. But to be fair, it isn't the easiest of themes to write about...
    I'm sure it would be easier and less dangerous to program basic defense and survival mechanisms into an AI (with limits and rules) than to build one that is sentient. Beyond the sentience-by-accident scenario, I still don't really see why humans would go through the trouble of building a sentient AI. And if it does become sentient by accident, humanity would most likely try not to make it happen again.
