What I think is lacking in most sci-fi stories

Right. But unless we're talking about fiction, ants could not get themselves into space. We're getting side-tracked by semantics. Which is okay, I guess.

If you want to talk about fiction/fantasy and presuppose a highly evolved race millions of years in the future that is directly descended from ants, and which has developed the technology (and yes -- *shudder* -- the economy which would be absolutely indispensable for providing the resources required to build flying machines that could both get ants into orbit and could sustain life up there), then sure, that would be another interesting conversation. But it's a tangent to what was being talked about just before.

Which, of course, is what's cool about a thread like this :-) It drifts around fairly aimlessly.
 
Secondly, unless I'm misunderstanding you (which is very possible, lol.. my brain is slow today), you seem to be making the assumption that only stupid people start wars. I have a feeling some of the powerful people who have started wars (or made decisions that directly -- or indirectly -- led to war) in history were not exactly dummies. *shrug*

No, I am not talking about who starts them, I am talking about all of the cannon fodder that are supposed to follow. How many men died in World War I, supposedly because some archduke was assassinated?

How many men fought for the Confederacy even though they did not own slaves?

If the people who want to start a war can't get enough peons to go along with the idiocy, then how do you have a war?

psik
 
No, I am not talking about who starts them, I am talking about all of the cannon fodder that are supposed to follow. How many men died in World War I, supposedly because some archduke was assassinated?

How many men fought for the Confederacy even though they did not own slaves?

If the people who want to start a war can't get enough peons to go along with the idiocy, then how do you have a war?

psik

Nice theory, but I offer a list of British volunteers for WW1 to illustrate a minor problem. Most are the poets commemorated at Poets' Corner: http://en.wikipedia.org/wiki/Poet's_Corner#World_War_I_poets. There are also http://en.wikipedia.org/wiki/Henry_Moseley and http://en.wikipedia.org/wiki/Saki. I could probably find many others. http://en.wikipedia.org/wiki/Patrick_Blackett,_Baron_Blackett is perhaps a special case, as he was already in the Navy in 1914.
 
And since we were talking about life in space or off Earth, there are many MUCH more expensive things in question than filling the population's bellies: hydroponic farms, atmosphere, radiation shielding, unique long-term medical requirements of living in a non-Terrestrial gravity and all the medical supplies that would entail -- and the mass-production of such supplies -- and the facilities / factories needed to mass-produce them.... etc etc... you name it.

We can't be content to exist unless we can produce the materials needed to exist. To produce those, we need an economy.

We have a problem with how the economy works with technology.

Ever notice how our so-called economists do not talk about PLANNED OBSOLESCENCE? I haven't been to an auto show in more than 20 years. I don't give a damn what that junk looks like. Shouldn't people who like science fiction know that the Laws of Physics do not change year to year? But we don't even hear about the drag coefficient of these cars.
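For what it's worth, the physics here really is static: aerodynamic drag is F = ½ρv²C_dA, so the drag coefficient C_d is one of the few numbers that actually matters from one model year to the next. A quick sketch in Python (the density, speed, and frontal-area values below are illustrative assumptions, not figures for any real car):

```python
# Aerodynamic drag: F = 0.5 * rho * v^2 * Cd * A
# Illustrative values (assumptions, not any specific vehicle):
RHO = 1.225    # air density at sea level, kg/m^3
V = 27.0       # ~97 km/h, in m/s
AREA = 2.2     # frontal area, m^2

def drag_force(cd, rho=RHO, v=V, area=AREA):
    """Drag force in newtons for a given drag coefficient."""
    return 0.5 * rho * v ** 2 * cd * area

# A slippery sedan (Cd ~ 0.30) vs. a boxier design (Cd ~ 0.40):
print(drag_force(0.30))
print(drag_force(0.40))
```

Since force scales linearly with C_d, dropping from 0.40 to 0.30 cuts drag (and the power spent fighting it at that speed) by a quarter -- the kind of number an auto show never puts on the placard.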

So the next question is what will really good robots do to the economy. How do you have an economy when 80% or more of the population does not need to work at all?

Are we only upgrading computers because we are making more bloated and inefficient software? Hey, now you can't run a Ma & Pa grocery store without a 3 GHz quad-core with 8 gig of RAM. :D

Can we mine the asteroids with just robots? That Mars landing was pretty impressive. Or maybe the robots can do all of the prospecting; not having to use fuel to push food and oxygen supplies around would make asteroid mining much more economical.

psik
 
Right. But unless we're talking about fiction, ants could not get themselves into space. We're getting side-tracked by semantics. Which is okay, I guess.

If you want to talk about fiction/fantasy and presuppose a highly evolved race millions of years in the future that is directly descended from ants, and which has developed the technology (and yes -- *shudder* -- the economy which would be absolutely indispensable for providing the resources required to build flying machines that could both get ants into orbit and could sustain life up there), then sure, that would be another interesting conversation. But it's a tangent to what was being talked about just before.

Which, of course, is what's cool about a thread like this :-) It drifts around fairly aimlessly.

Actually, I was simply assuming a small group of ants accidentally stowing away aboard a human ship and thriving well enough (i.e., finding enough to eat) to start a colony. (Hopefully they brought a queen.)
 
I disagree. I don't think war helps us. But a crisis can certainly help. If man were stripped of his warlike tendencies, he'd be forced to look for other solutions to problems.

There's lots of talk today about going to war with Iran. But if enough people speak up against the war-drums, diplomacy could win out.

2001 was kind of on the theme of the "continuation" of war-like tendencies within man. Personally, I don't think it's man's destiny to be warlike. On the contrary, I think man's war-tendency is a type of "short-circuit" response.


brian

While I agree that we ought to abandon war, at least amongst ourselves, it's naive to think that the entire species will comply.

And while I do agree that wars should be avoided simply from a moral point of view, I think it would be naive to discount how wars have allowed us to make leaps and bounds technologically. The factories that churned out thousands of tanks in a short period of time gave us unprecedented insight into industrialization and production. The massive budgets for flight and rocketry during the Second World War and the Cold War produced advances that it would be naive to think would have happened in such a short period of time without those wars. The internet we use right now was a result of the military.

Still, the millions dead, and the fact that we can now set our civilization back hundreds or thousands of years within hours, lead me to believe that it may not be worth it. But it's naive to think that we got nothing out of war.

Ultimately, what it comes down to is what we want versus what the evidence suggests we will get. What we want is a civilization that doesn't rely on war and moves on to more efficient ways of advancing and getting the resources we need. What the evidence suggests we will get is probably what we have always had: periods of peace followed by periods of conflict, with the only difference being that we will have better toys to kill each other with.

And while, yes, diplomacy and public pressure will avert some wars, there will also be many wars that will be waged regardless of what the civilian population wants. There will be coups, there will be madmen, and there will be overbloated military budgets, just as there always have been.


Lastly, there is always the scenario of an aggressive, militaristic alien species that would want to dominate us. I don't think our non-war stance would hold if, say, we stopped or completely minimized military research for hundreds of years prior to such aliens showing up. Better to be peaceful while holding all the big sticks.
 
If the people who want to start a war can't get enough peons to go along with the idiocy, then how do you have a war?

psik

Drones?

After all, we created the Atari generation to take care of all this for us. Think about it. Right now we have the lowest soldier-to-civilian (as in, never ever been in the military) ratio of any time in US history.

I almost said "our", forgetting that this is an international forum. My apologies for being too country-centric.
 
Somehow, I don't think discussion of future war fits into the "What I think is lacking in most sci-fi stories" subject of this thread.
 
Somehow, I don't think discussion of future war fits into the "What I think is lacking in most sci-fi stories" subject of this thread.

After all we created the Atari generation to take care of all this for us. Think about it.

I was talking about what kind of people would be allowed in space, and that "drones" could not survive there and would probably get other people killed.

But that is the problem with much SF. The writer cannot create very intelligent characters. Because the writer controls the background and knows what will happen, a character can be made to seem smart in action, but the thought processes are just not very interesting. Some writers can do it, but most cannot. None of the Star Wars characters are really intellectually interesting.

psik
 
Do any of you put any credence in the idea Vernor Vinge (among others) has been promulgating for decades now, that 1) there is a technological singularity coming, and 2) we can't possibly predict life beyond that point?
 
Do any of you put any credence in the idea Vernor Vinge (among others) has been promulgating for decades now, that 1) there is a technological singularity coming, and 2) we can't possibly predict life beyond that point?

I don't buy the AI/transhuman junk. So we are getting faster, cheaper von Neumann machines. They are symbol manipulating devices. They do not understand the information.

This relates to General Semantics, which some sci-fi writers like Heinlein and A. E. van Vogt were interested in back in the 50s.

I do think we have the potential for a social singularity via education. I have a Google Nexus 7. Soon we will have dual-core Android tablets from China for $100. The trouble with the Nexus 7 is that it has no microSD slot. Google decided it should be a media-consumption device. But it has provided a standard for other tablet makers. So 7-inch Androids may take over education.

What will we do with them? Or what will kids do with them?

http://users.aber.ac.uk/dgc/funtheyhad.html

Only 140 years early.

psik
 
I don't buy the AI/transhuman junk. So we are getting faster, cheaper von Neumann machines. They are symbol manipulating devices. They do not understand the information.

This relates to General Semantics, which some sci-fi writers like Heinlein and A. E. van Vogt were interested in back in the 50s.

psik

You underestimate the power of AI :eek: There is a way they will be able to understand the information... research is going on as we speak...
 
You underestimate the power of AI :eek: There is a way they will be able to understand the information... research is going on as we speak...

I read this the other day. Interesting, but still a long way from practical.

http://www.codeproject.com/News.aspx?ntag=19837497392084669&_z=5132512

By the way, mark me as skeptical, not a nay-sayer. I just think we have a tendency to overestimate how fast things will develop. As for the Singularity, well, Mr. Kurzweil is a noted futurist and genius, but his predictions are a very mixed bag. I am placing the Singularity in the unlikely category, at least within the next generation or two.

I hope I am wrong, but even if I am, it is unlikely to happen while I am still here -- perhaps for my daughter, though.
 
You underestimate the power of AI :eek: There is a way they will be able to understand the information... research is going on as we speak...

We have been hearing that since the 60s.

But it is curious that we almost never hear a good explanation of how a von Neumann machine works. Today's von Neumann machines just do the same things faster and cheaper. But now it is cheap enough to track your cookies and stick the proper advertising in your face.
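To make that point concrete, here is a toy von Neumann machine in Python (a minimal sketch with a made-up three-instruction set, not any real architecture): one memory holds both program and data, and the processor just fetches, decodes, and executes symbols it does not "understand".

```python
# A toy von Neumann machine: program and data share one memory,
# and the CPU blindly fetches, decodes, and executes -- pure symbol
# manipulation with no grasp of what the symbols mean.
def run(memory):
    acc, pc = 0, 0                     # accumulator and program counter
    while True:
        op, arg = memory[pc]           # fetch + decode
        pc += 1
        if op == "LOAD":               # copy a memory cell into the accumulator
            acc = memory[arg]
        elif op == "ADD":              # add a memory cell to the accumulator
            acc += memory[arg]
        elif op == "STORE":            # write the accumulator back to memory
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Instructions (tuples) and data (ints) sit side by side in one memory.
mem = {0: ("LOAD", 10), 1: ("ADD", 11), 2: ("STORE", 12), 3: ("HALT", 0),
       10: 2, 11: 3, 12: 0}
run(mem)
print(mem[12])  # 5
```

Faster clocks and cheaper RAM change none of this loop; they just let it churn through more symbols per second -- which is the point about tracking cookies being the real-world impact.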

That is what is impacting us more than AI.

The Space Merchants is more prophetic than Neuromancer.

psik
 
We have been hearing that since the 60s.

But it is curious that we almost never hear a good explanation of how a von Neumann machine works. Today's von Neumann machines just do the same things faster and cheaper. But now it is cheap enough to track your cookies and stick the proper advertising in your face.

That is what is impacting us more than AI.

The Space Merchants is more prophetic than Neuromancer.

psik

Predicting exactly how the future will turn out technology-wise is very difficult, because of the need to take into account details. After all, the Romans could have had steam engines if they had put their mind to it. All they needed was to use one bit of technology to improve Hero of Alexandria's steam engine and it would have been a done deal... and yes they were already using that technology elsewhere.

However, general trends are predictable. The need for quicker reactions from computers, because we are asking more of them, is inevitable... which means neural networks will be improved or superseded. The evolutionary process is already being mimicked by evolutionary algorithms. So here we already have two pieces of the puzzle... and just as aircraft were invented by putting the right technologies together in the right framework, someday someone will put together the algorithms to get a sentient AI. It's a case of when, not if...
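A minimal sketch of the evolutionary-algorithm idea in Python (the genome length, population size, mutation rate, and generation count are all arbitrary assumptions): random variation plus selection climbs toward a target with no designer in the loop.

```python
import random

# Toy evolutionary algorithm: evolve a bit string toward all-ones
# using only mutation and survival-of-the-fittest selection.
random.seed(1)
GENOME_LEN = 20

def fitness(genome):
    return sum(genome)  # number of 1-bits; 20 is the optimum

def mutate(genome, rate=0.05):
    # Flip each bit independently with the given probability.
    return [bit ^ (random.random() < rate) for bit in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(30)]

for generation in range(200):
    # Keep the fitter half as parents, refill with mutated copies.
    population.sort(key=fitness, reverse=True)
    parents = population[:15]
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

print(fitness(population[0]))  # best fitness found; climbs toward 20
```

Nothing in the loop "knows" what the target is; selection pressure alone does the work, which is why the same recipe scales to problems where no one can write down the answer in advance.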
 
Do any of you put any credence in the idea Vernor Vinge (among others) has been promulgating for decades now, that 1) there is a technological singularity coming, and 2) we can't possibly predict life beyond that point?

In a word: No.

In a few more words: I think Vinge and many others severely overestimate not the capability of human-based technology, but the likelihood that it will actually make such an incredible transformation of our world. They assume tech will continue to develop in all directions, unabated, and that this will bring about the singularity. I say that opinion is unrealistic.

My impression and experience with how humans deal with technology tells me that tech will continue to develop in narrow pathways, directed by largely immediate need and metered by an untrusting populace, giving us advances in some areas and stagnation in others, and a result that will be nowhere near a "singularity."
 
However, general trends are predictable. The need for quicker reactions from computers because we are asking more of them is inevitable... which means neural networks will be improved or superseded. The evolutionary process is already being mimicked by evolutionary algorithms. So here we already have two pieces of the puzzle... and like the invention of aircraft by putting the right technologies together in the right framework, someday someone will put together the computer algorithms to get a sentient AI. It's a case of when, not if...

But what would the purpose be of a sentient AI? If we build computers and machines to do a certain specific task, why give them the ability to say no?
 
But what would the purpose be of a sentient AI? If we build computers and machines to do a certain specific task, why give them the ability to say no?

The main purpose of sentience in human beings has to be survival... it could be the same for a sentient AI, but it could have a completely different purpose embedded in it, depending on what it was set up to do and what pushed it over the edge into sentience. This difference between AI sentience and survival-driven sentience has never really been explored in science fiction. But to be fair, it isn't the easiest of themes to write about...
 
Sentience doesn't have to be based on individual survival. Survival of a group (including a sense of self-sacrifice) can override individual survival--just as it sometimes does with humans and other animals. Or the need to accomplish a task can override survival.

Examine the actions of HAL 9000 in 2010 and you'll see an example of that.
 
The main purpose of sentience in human beings has to be survival... it could be the same for a sentient AI, but it could have a completely different purpose embedded in it, depending on what it was set up to do and what pushed it over the edge into sentience. This difference between AI sentience and survival-driven sentience has never really been explored in science fiction. But to be fair, it isn't the easiest of themes to write about...

I'm sure it would be easier and less dangerous to program basic defense and survival mechanisms into an AI (with limits and rules) than to build one that is sentient. Beyond the sentience-by-accident scenario, I still don't really see why humans would go through the trouble of building a sentient AI. And if it is sentience by accident, humanity would most likely try to keep it from happening again.
 
