The relationship between Quote and Curly

Jan 20, 2009 at 2:31 PM
In front of a computer
"Man, if only I had an apple..."
Join Date: Mar 1, 2008
Location: Grasstown
Posts: 1435
About Wedge's comment that computers are bound by their programming... that's true of normal computers, yes. But what if a computer is able to alter its own programming?

Jay City said:
Robots can evolve too IMO.
This is really the key point. In the classic picture, robots can't evolve, and that's a major reason why they cannot display human-like behaviour. And by "evolve", I mean the evolution of the computer's "brain" over time – its body is really irrelevant to this. Its "brain" is essentially its programming.
 
Jan 20, 2009 at 8:59 PM
graters gonna grate
"Heavy swords for sale. Suitable for most RPG Protagonists. Apply now!"
Join Date: Jul 2, 2008
Location: &
Posts: 1886
Age: 31
Celtic Minstrel said:
About Wedge's comment that computers are bound by their programming... that's true of normal computers, yes. But what if a computer is able to alter its own programming?

This is really the key point. In the classic picture, robots can't evolve, and that's a major reason why they cannot display human-like behaviour. And by "evolve", I mean the evolution of the computer's "brain" over time – its body is really irrelevant to this. Its "brain" is essentially its programming.

Hmmm, you might actually be right about that. I'm still a bit skeptical, though, because even if a robot's programming changes, the method by which the programming is interpreted does not. The programming of a robot is not the most fundamental part of it; the processor and RAM are. You could think of the processor and RAM as the actual program, and the program as a data file that is being interpreted by the processor, which is basically no different from a "traditional" computer (in which the program can't rewrite itself).
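wedge's processor-vs-program split can be sketched with a toy interpreter: the `run` loop below plays the role of the fixed "processor", while the program is just a mutable data structure that can rewrite itself. This is purely an illustrative sketch; all the names and the instruction set are made up.

```python
# Toy illustration: the interpreter (the "processor") is fixed, while the
# program it runs is just mutable data -- so even a "self-modifying"
# program is still interpreted by machinery it cannot touch.

def run(program, steps=10):
    """Interpret a program (a dict of numbered instructions) for a few steps."""
    counter = 0
    pc = 0  # program counter
    for _ in range(steps):
        op = program.get(pc)
        if op is None:
            break
        kind, arg = op
        if kind == "add":            # modify the counter
            counter += arg
            pc += 1
        elif kind == "rewrite":      # the program edits its own data
            target, new_op = arg
            program[target] = new_op
            pc += 1
        elif kind == "jump":
            pc = arg
    return counter

# A program that rewrites its own instruction 0, then loops back to it.
prog = {
    0: ("add", 1),
    1: ("rewrite", (0, ("add", 10))),  # change instruction 0 from +1 to +10
    2: ("jump", 0),
}
print(run(prog, steps=6))  # -> 11 (one +1 pass, then one +10 pass)
```

Note that the program's behaviour genuinely changed over time, yet `run` itself never did, which is exactly the distinction being argued about.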



Ok, I'm sorry but I just can't bring myself to drop the free-will debate. SP/andwhy, feel free to move this next part into a new thread if you think it's too off-topic.

I read that blog you mentioned, andwhy, and there was some interesting stuff there.

miller said:
Too often, people only care about the question of whether free will exists, when they should care about what "free will" really means. And what do we really mean by free will anyway? Does that mean that we're free from outside influences? Does it mean that we are morally culpable for our own actions? Does it mean that our actions are unpredictable? Is it necessarily a supernatural force?
miller said:
"Unpredictable <=> morally responsible" just seems like a non sequitur to me, but hey, that's a problem for philosophy, not for physics.

These are good points. I should have mentioned earlier that when I say "free-will" I'm talking about both unpredictability and moral responsibility. I don't believe that unpredictability implies moral responsibility, or vice versa; it is possible to have one without the other. I suppose I would define free-will as the combination of unpredictability and moral responsibility.


miller said:
Bonus question: what's the difference between free will and its illusion?

That, I have an answer to. In order to have true free-will as opposed to the illusion thereof, you need true unpredictability and moral responsibility as opposed to the illusions thereof.

The difference between true moral responsibility and the illusion thereof is pretty obvious: do you do what's right because you genuinely care about other people, or just because you were taught to, or because there's some reward in it for you?

The difference between true unpredictability and the illusion thereof is not so obvious.

miller said:
In principle, classical mechanics are completely deterministic, but in practice they are not. If you've got a single particle, you can easily predict its motion, but if you have more, it's not so simple. If you have N particles, you have to keep track of 3N numbers to specify their coordinates, and 3N more to specify their momentum. Once you have at least three particles, the gravitational equations already become impossible to solve without approximations. In any typical system on the human scale, we'll have on the order of 10^22 particles. Because it is so vastly impractical to keep track of so much information, we instead use statistical descriptions of such large systems. Temperature is one example; it describes a probability distribution of energies for each particle. And that's why determinism does not imply predictability.

Well, that's not necessarily correct, depending on how you define "predictability". It is definitely possible to have a deterministic system which gives an extremely convincing illusion of unpredictability. As Miller pointed out, the more particles you add to a system, the more difficult it becomes to predict what will happen in the system. Once you get up to a "normal" sized system with about 10^22 particles, it is so insanely difficult to predict exactly what will happen, that no human or computer could ever hope to do so. Therefore, we use those "statistical descriptions" like temperature, that Miller mentioned.
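Miller's point can be seen in a toy simulation: a completely deterministic system whose future is practically unpredictable, because tiny uncertainties in the starting state blow up exponentially. This sketch uses the logistic map, a standard textbook example of chaos, not anything specific to the physics being quoted.

```python
# A deterministic system that quickly becomes unpredictable in practice:
# the logistic map x -> r*x*(1-x) with r=4 is fully determined by its
# starting value, yet two starts differing by one part in a billion
# diverge after only a few dozen iterations.

def logistic_orbit(x0, n, r=4.0):
    """Iterate the logistic map n times starting from x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.3, 50)
b = logistic_orbit(0.3 + 1e-9, 50)  # almost identical start

early_gap = abs(a[1] - b[1])  # still tiny after one step
late_gap = max(abs(x - y) for x, y in zip(a[40:], b[40:]))  # of order 1

print(early_gap, late_gap)
```

So determinism alone doesn't give you predictability: a real predictor would need the initial conditions to impossible precision, which is the "illusion of unpredictability" being described.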

Now, suppose there exists some sort of omniscient, infinitely fast supercomputer that is capable of knowing the exact location, velocity, mass, and electric charge of every particle in the universe, and then calculating the exact location and velocity of every particle in the universe during every instant from the present time to infinitely in the future, all in a single instant.

In the deterministic universe described by classical, non-quantum physics, this supercomputer would always be able to unerringly, instantaneously predict the future of any system, regardless of whether there were 1, 2, 3, 10, 10^22, or 10^23049873478 particles in it. So the system is not truly unpredictable; it just gives an extremely good illusion of it.

A system that is truly unpredictable would be impossible for anyone or anything, including this theoretical supercomputer, to predict with 100% accuracy.

I'd say more, but this post already probably makes the top 5 longest posts ever written on these forums, so I'll restrain myself xD.
 
Jan 20, 2009 at 11:20 PM
Administrator
Forum Administrator
"Life begins and ends with Nu."
Join Date: Jul 15, 2007
Location: Australia
Posts: 6212
Age: 38
wedge of cheese said:
Hmmm, you might actually be right about that. I'm still a bit skeptical, though, because even if a robot's programming changes, the method by which the programming is interpreted does not. The programming of a robot is not the most fundamental part of it; the processor and RAM are. You could think of the processor and RAM as the actual program, and the program as a data file that is being interpreted by the processor, which is basically no different from a "traditional" computer (in which the program can't rewrite itself).
You cannot change the way the human brain processes and distributes information. Why is that any different?
 
Jan 20, 2009 at 11:39 PM
Senior Member
"Huzzah!"
Join Date: Jan 11, 2009
Location:
Posts: 192
Guys, this is getting too serious. Think outside the box, for god's sake.
It's fiction that we're talking about. It's another reality; the same laws can't be applied exactly.

Simple things like "robots =/= blood links" work, that's common sense, but things like RAM and all that?
Maybe they don't use RAM for that. Maybe they run on magic ice cream, you don't know!

I think it's clear by now that they cannot be real siblings, only in heart and in arms, and that " & } care for each other. The "iron bond" is the strongest proof here, so there is some kind of love, maybe the only kind possible for robots.

Love wins as always :p
 
Jan 21, 2009 at 2:03 PM
In front of a computer
"Man, if only I had an apple..."
Join Date: Mar 1, 2008
Location: Grasstown
Posts: 1435
wedge of cheese said:
Hmmm, you might actually be right about that. I'm still a bit skeptical, though, because even if a robot's programming changes, the method by which the programming is interpreted does not. The programming of a robot is not the most fundamental part of it; the processor and RAM are. You could think of the processor and RAM as the actual program, and the program as a data file that is being interpreted by the processor, which is basically no different from a "traditional" computer (in which the program can't rewrite itself).
Then create a processor that evolves.

andwhyisit said:
You cannot change the way the human brain processes and distributes information. Why is that any different?
The brain evolves over time with new experiences. It's called "learning".
 
Jan 21, 2009 at 11:28 PM
Administrator
Forum Administrator
"Life begins and ends with Nu."
Join Date: Jul 15, 2007
Location: Australia
Posts: 6212
Age: 38
Celtic Minstrel said:
The brain evolves over time with new experiences. It's called "learning".
The brain references those experiences, but that doesn't change the way the human brain processes and distributes information or "experiences".

As I mentioned before the human brain stores "experiences" and acts upon these stored "experiences" in many different ways for the various processes that make up who you are.

Jay City said:
Guys, this is getting too serious. Think outside the box, for god's sake.
It's fiction that we're talking about. It's another reality; the same laws can't be applied exactly.
Good point.
 
Jan 22, 2009 at 6:12 AM
Lvl 1
Forum Moderator
"Life begins and ends with Nu."
Join Date: May 28, 2008
Location: PMMM MMO
Posts: 3713
Age: 32
andwhyisit said:
The brain references those experiences, but that doesn't change the way the human brain processes and distributes information or "experiences".

As I mentioned before the human brain stores "experiences" and acts upon these stored "experiences" in many different ways for the various processes that make up who you are.

I'd say a robot does not have free will, as a robot will blindly follow its programming. Which is exactly what humans do. What we "think" is our free will is the ability to choose, but in the end we can only make one choice, based on our programming (logic, reasoning, the rules in our mind that we abide by). Even if we think we could have made the other choice, we only can if we find new data (find something out), or if we give it more thought (which will inherently make us find more data, as we collect data every day).

Computers that can learn are being created somewhere, I think; I read an article about them once. Robots could basically be given "free-will", but it would still be limited to what they take in and how their programming decides to react to it, just like us. In all reality, we are pretty much computers, carried out to do what we were made to do. Why we were made to survive and propagate our species is unknown to me (besides the reason that we must live on; it's like a competition among the species, and why chemical reactions taking place became competitive is beyond my knowledge). But robots would basically be the same. I'd say creating an AI, no matter how small, is literally creating life, for a program is pretty much an organism without mutations (the part that lets organisms evolve).
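The "organism without mutations" remark has a neat toy counterpart: add mutation plus selection to a program's parameters and it does "evolve". Below is a minimal (1+1) evolutionary-strategy sketch with invented names; it's an illustration of the idea, not a claim about how any real learning system works.

```python
import random

# Sketch of "a program with mutations": a single candidate "behaviour"
# (here just a number) is repeatedly mutated at random, and selection
# keeps the mutant only when it improves fitness (closeness to a target).

def evolve(target, generations=200, seed=0):
    rng = random.Random(seed)
    best = rng.uniform(-10, 10)            # initial random behaviour
    for _ in range(generations):
        mutant = best + rng.gauss(0, 0.5)  # random mutation
        # selection: keep the mutant only if it is closer to the target
        if abs(mutant - target) < abs(best - target):
            best = mutant
    return best

result = evolve(target=3.0)
print(result)  # converges close to 3.0
```

Nothing in the loop "knows" the answer; blind mutation plus a keep-the-better rule is enough for the behaviour to improve over generations, which is the bare-bones version of evolution being described.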

Therefore, Curly and Quote have as much free will as we do, which is pretty much non-existent.

...damn, this conversation has veered waaayyy off topic, should type something more related to the thread.

I'd say that Curly and Quote do have feelings for each other, only because they were programmed to have such "feelings". They feel for each other for what the other has that is beneficial to them. Pretty much the same as human relationships, if not identical. Their behavior is just the product of their programming, nothing more.

What the relationship between the two of them is exactly is pretty complicated, and the story never really implies love between the two, but it would be fair to draw that conclusion from the "iron bond" and them going to live somewhere together (with Balrog). As to why Curly would be content just living out the rest of her days is beyond me, but it must be related to her core programming. For once a machine completes its task, it simply does nothing, unless there is more programming after. Therefore she is either breaking the laws of reality, or, more likely, there is code to deal with what to do with herself after her mission. It seems both Curly and Quote were programmed to mimic human behaviour, so technically everything they do will be as human as their creator (Pixel, or whoever Pixel says their creator was, though he'll just leave it "up to our imaginations") wanted it to be.

To sum all that up, I'd say the two robots have feelings for each other (whether they can act on those feelings is undetermined), but how strong these feelings are is unknown.
 
Jan 22, 2009 at 7:54 AM
I WANT YEN LIN!!!
Bobomb says: "I need a hug!"
Join Date: Mar 21, 2008
Location: Where you don't
Posts: 761
Age: 15
Jay City said:
Guys, this is getting too serious. Think outside the box, for god's sake.
It's fiction that we're talking about. It's another reality; the same laws can't be applied exactly.

Simple things like "robots =/= blood links" work, that's common sense, but things like RAM and all that?
Maybe they don't use RAM for that. Maybe they run on magic ice cream, you don't know!

I think it's clear by now that they cannot be real siblings, only in heart and in arms, and that " & } care for each other. The "iron bond" is the strongest proof here, so there is some kind of love, maybe the only kind possible for robots.

Love wins as always :confused:
But if every fiction author thought like that, I would expect nothing interesting to come out of art, and thus there'd be no point having discussions on this subforum.
If you say we're relating them too much to logic, you're keeping them too much in just utter, unlinked imagination.
Furthermore, you state it all comes down to love, one of the most cliché (damn notebook keyboard) excuses in the world of art.




The iron bond must be some kinda virus-esque programme that works wirelessly.
 
Jan 22, 2009 at 10:01 AM
Senior Member
"Huzzah!"
Join Date: Jan 11, 2009
Location:
Posts: 192
^ Isn't love (and the world of art) wonderful? :confused:

Here we have an expression; you probably have an equivalent in your language, or even the same one.
Roughly translated: "Not 8 or 80".

We are all sure of one thing: Pixel's robots, " & }, aren't just cold machines. He didn't make them that way, and you know it! God strike me with thunder if grandpa Mimiga is wrong!
Trying to apply the rules of our world to the letter doesn't work, because our machines are too far from being like Quote or Curly.
The magic ice cream was a joke, of course, but making guesses based on wild and chaotic imagination won't help either.

People can die from dehydration or from water intoxication; take the middle ;D
 
Jan 22, 2009 at 3:08 PM
Lvl 1
Forum Moderator
"Life begins and ends with Nu."
Join Date: May 28, 2008
Location: PMMM MMO
Posts: 3713
Age: 32
freezit4 said:
Furthermore, you state it all comes down to love, one of the most cliché (damn notebook keyboard) excuses in the world of art.

Clichés are clichés for a reason: they've worked before and they'll work again. In fiction, anything that is illogical can be explained with love. It is the trump card of all literary devices.
 
Jan 22, 2009 at 6:09 PM
In front of a computer
"Man, if only I had an apple..."
Join Date: Mar 1, 2008
Location: Grasstown
Posts: 1435
andwhyisit said:
The brain references those experiences, but that doesn't change the way the human brain processes and distributes information or "experiences".

As I mentioned before the human brain stores "experiences" and acts upon these stored "experiences" in many different ways for the various processes that make up who you are.
But learning alters the brain, e.g. by forming new "neural pathways". So learning does change the way the brain processes information, to some extent.

GIRakaCHEEZER said:
Therefore she is either breaking the laws of reality, or more likely there is code to deal with what to do with herself after her mission.
There wouldn't have to be code to cover what to do after the mission. Curly and Quote are probably adaptive robots, which means that they can extend and alter their own programming (i.e., they can "learn").
 
Jan 23, 2009 at 3:41 AM
Lvl 1
Forum Moderator
"Life begins and ends with Nu."
Join Date: May 28, 2008
Location: PMMM MMO
Posts: 3713
Age: 32
Celtic Minstrel said:
But learning alters the brain, e.g. by forming new "neural pathways". So learning does change the way the brain processes information, to some extent.

There wouldn't have to be code to cover what to do after the mission. Curly and Quote are probably adaptive robots, which means that they can extend and alter their own programming (i.e., they can "learn").

Adaptive robots have code to make them adaptive, which in turn means that their "adaptive code" would tell them what to do after their mission. Adaptive robots probably wouldn't be able to change their core programming, but if they were adaptive they could have separate layers of code, I suppose, and a set of functions to edit the special "editable" code in their system. That would be some complicated stuff nonetheless. The fact that they possessed adaptive programming is what would cover their actions after their mission. But they'd still need to be given objectives, or the goal of finding new objectives to accomplish after the mission.
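That layered design can be sketched as a toy: a fixed core (the class methods, which the robot can't change) running rules from an editable layer (a plain dict it can change). All the names, situations, and actions here are hypothetical, invented purely for illustration.

```python
# Sketch of "separate layers of code": a fixed core loop that consults an
# editable rule table, plus a sanctioned function for editing that table.

class AdaptiveRobot:
    def __init__(self):
        # Editable layer: rules mapping a situation to an action.
        self.rules = {"mission_active": "fight_monsters"}

    def learn_rule(self, situation, action):
        """Part of the fixed core: the only sanctioned way to edit rules."""
        self.rules[situation] = action

    def act(self, situation):
        """Fixed core loop: look up the editable layer, fall back to a
        built-in default when no rule covers the situation."""
        return self.rules.get(situation, "seek_new_objective")

robot = AdaptiveRobot()
print(robot.act("mission_complete"))  # no rule yet -> "seek_new_objective"
robot.learn_rule("mission_complete", "live_quietly")
print(robot.act("mission_complete"))  # learned rule -> "live_quietly"
```

The default in `act` is the "goal of finding new objectives" fallback: even a fully adaptive robot still needs some fixed behaviour for the situations its editable layer doesn't cover yet.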
 
Jan 23, 2009 at 4:36 AM
Administrator
Forum Administrator
"Life begins and ends with Nu."
Join Date: Jul 15, 2007
Location: Australia
Posts: 6212
Age: 38
I personally think that unlike soldier robots, Quote and Curly were meant to emulate humans mentally and physically, and then exceed them physically, making them perfect for a two-person mission: you can give them any memories or skills that you want, and they can adapt like any human while being physically stronger. Think "Dollhouse", except minus the whole "reset memories" thing.
 
Jan 23, 2009 at 4:50 AM
Lvl 1
Forum Moderator
"Life begins and ends with Nu."
Join Date: May 28, 2008
Location: PMMM MMO
Posts: 3713
Age: 32
andwhyisit said:
I personally think that unlike soldier robots, Quote and Curly were meant to emulate humans mentally and physically,

I hope they weren't meant to emulate humans mentally, because otherwise they might be considered mentally retarded among other robots. But they probably were programmed to mimic the behaviour nonetheless, hopefully while still maintaining their processing power.
 
Feb 22, 2009 at 8:23 AM
Senior Member
"Huzzah!"
Join Date: Mar 24, 2008
Location: Florida
Posts: 197
Age: 32
The matter is that the evidence isn't enough for either side. This is why fan works are made. We, the fans, have the right to make a sub-plot that resonates with our opinions, ideas, and beliefs about the characters. If we make it to where Quote and Curly are in love, then we can show that. If we make it to where they're the best of family, then we can show that.

Inconclusive matters are inconclusive. Pick your belief and stick to it.

Honestly though, I like the <3 between the two, since I believe the Iron bond has a heart on it to signify her love for him. But that's just me.

One thing that baffles me, though, is that I hear CURLY BRACE is written on his hat.
That kinda supports the sibling theory more, since if they knew each other that far back they might have been made by the same creator. But then why doesn't she have anything with "Quote" on it?

Look at me, getting all worked up. I guess I'll leave it as I previously stated: leave it to the imagination.
 
Mar 4, 2009 at 12:51 AM
Senior Member
"Wahoo! Upgrade!"
Join Date: Mar 4, 2009
Location: Over the Sun
Posts: 53
I'm for the Quurly (Cote? Quorly?) pairing mostly because Sue is a different species and Curly's pretty hot. I mean come on, she sacrifices herself for Quote and then you save her!
 
Mar 4, 2009 at 5:23 PM
In front of a computer
"Man, if only I had an apple..."
Join Date: Mar 1, 2008
Location: Grasstown
Posts: 1435
Logic fail.
 
Mar 4, 2009 at 7:56 PM
graters gonna grate
"Heavy swords for sale. Suitable for most RPG Protagonists. Apply now!"
Join Date: Jul 2, 2008
Location: &
Posts: 1886
Age: 31
How is that a logic fail!?
 
Mar 4, 2009 at 9:22 PM
In front of a computer
"Man, if only I had an apple..."
Join Date: Mar 1, 2008
Location: Grasstown
Posts: 1435
I've already forgotten... :eek:
 
Mar 5, 2009 at 9:06 AM
I WANT YEN LIN!!!
Bobomb says: "I need a hug!"
Join Date: Mar 21, 2008
Location: Where you don't
Posts: 761
Age: 15
Because you is not Quote?
 