Screencap from Deja Q:

Working In Groups

Chapter 6

When Q returned to engineering, he felt like a spring on high compression, wound unbelievably tight. The hope implied in Data's promise to help him and Troi's promise to get Picard's permission so Data wouldn't get in trouble — not that he'd let her know that that was the goal he was aiming for, but it took a great load off his mind to imagine that Troi could get Picard's permission and therefore Data wouldn't be hurt for helping him — should have made him feel better, and in a sense it did — the bleak despair he'd felt earlier when he'd been certain of death had been worse than almost anything else. But the hope was making him agonizingly tense, wired and jumpy and desperate to begin the work. The uncertainty had his mind going around in useless circles. He couldn't plan for the future without knowing if he had one or not. He couldn't plot out how to solve the technical issues around his plan for his own survival because he didn't have enough information yet. And the gap left in his mind by the absence of anything useful he could turn his mental resources on was being filled with completely unproductive speculation, such as how much dying would hurt, and would any of the Q actually care when he died, and who would be taking over the projects he'd left undone or if anyone would even bother.

The situation with the Bre'el III satellites had to be essentially resolved by now, he thought. LaForge was right, really; he hadn't been needed for the project. His strength was his advanced knowledge and intuition; he should be a source of creative ideas, and the implementation should be up to the engineers who'd spent their tiny existences studying this limited technology. So really, all he should need to do at this point was check in, and then commandeer one of the terminals for research on his project.

But as he walked in, LaForge said, "Hey, Q. We're getting reports from Bre'el IV that they're still suffering considerable tidal stress effects. The coasts are still flooded and there's still quakes going on along the fault lines, even though we're holding the moon in place. Anything we can do about that?"

Q stared at him in shock and disbelief. "What, exactly, did you think we could do about a black hole? Or did you once again fail to realize that if there's a black hole tugging a moon out of orbit it's bound to generate some tidal stress on a much bigger planet in the same general area as well?"

"I didn't fail to realize that the black hole might be causing tidal stresses, I just wondered if you knew of anything we could do about it," LaForge said. "They can't get rescue workers into some of the worst quake zones to get people out from under the buildings, because they keep having quakes—"

"And what exactly did you expect me to do about it?" Q asked, almost shrilly. "I'll just snap my fingers and remove the black hole for you, is that it? Oh, wait, I have no powers! Perhaps you'd like to try pushing the black hole away like you did with the moon! Because that will work so well with your primitive technology!"

"I didn't expect you to be able to do anything, I was just asking. You're our current resident expert on extreme science, and I figured if anyone would know of something to do, you might, but if you don't—"

"You're right, I am vastly more intelligent and knowledgeable than you — which includes the ability to tell a hopeless situation when I see one. You've already moved the moon for these people! What more do they want? There's nothing your inferior, ignorant, under-evolved minds are capable of inventing that could possibly do anything about the gravitational pull of a black hole and the tidal stresses it's causing, and the fact that you even think such a thing is possible—"

"You know what, Q? Go home. You're in no shape—"

"Go home? Exactly how am I supposed to do that? They threw me out!"

LaForge sighed sharply in exasperation. "I didn't mean the Continuum, I meant your quarters—"

"My quarters aboard this pathetic, benighted little starship are hardly home—"

"I don't care where you go, but I want you to leave Engineering. Now. I don't know what you and Counselor Troi discussed and if it has anything to do with why you're flipping your lid, but you're overwrought, you're screaming at people—"

"I am not! You people are so averse to any kind of conflict within your mammalian pack hierarchies that you can't even handle a slightly raised voice—"

"Q!" Now it was LaForge who was shouting. "Shut up!"

Startled, Q did shut up. "You are in no shape to work here," LaForge continued. "Go back to your quarters, or somewhere, I don't care where but I don't want you here."

"And what am I supposed to do, just twiddle my thumbs until the Calamarain come for me?" Q asked harshly.

"I don't actually care. Until you can get your emotions under control and stop shouting at people, you can't work here." He pointed at the door to Engineering. "Go."

Q stared at him, and then at the door. "I don't—I haven't got—"

"Q, do I have to call Security?"

"Geordi," Data said.� "I believe it would be best if I escorted Q to his quarters and remained with him until he is able to regain his equilibrium."

Q turned on Data, shocked, feeling betrayed.� LaForge didn't seem to like the idea either.� "Data, I could really use you here�"

"I do not believe I am necessary at the moment.� We have fully plotted a plan of attack to bring the Bre'el III moons down, and if there is nothing of use we are able to do to aid with the tidal forces on Bre'el III, then there is no further need for me here.� Our best hope to provide any sort of assistance in the matter of alleviating the tidal stresses would be to allow Q to calm down and give the matter more serious thought, as he is correct that our technological levels are inadequate to the task of dealing with a black hole."

"And what, you think if you babysit me that I'm magically going to think of a solution to a problem that has none?� Forget it!� I don't need you!" Q shouted at Data.

"Data, it's not going to help," LaForge said.

"Nonetheless, I intend to try."

LaForge threw up his hands. "I really wanted you here, but you outrank me, Data. If you feel like this is what you've got to do, I can't stop you."

"I do believe that this would be my best course of action."

"Well, I don't!" Q said.� "I don't need a babysitter, I don't need someone to calm me down and I don't need you, Data!� Stay here with LaForge, he obviously needs a babysitter more than I do, since he's the one whining about how he can't do without you�"

"Q, I am third in command aboard the Enterprise.� In the absence of orders from Captain Picard or Commander Riker, or medically advised orders from Dr. Crusher, that counter my decisions, I have the right to give you orders.� I am therefore ordering you to accompany me from Engineering."

Q was breathing hard, his hands shaking.� LaForge interfering with his plan to do the research required to save himself by demanding he do something impossible and then kicking him out of Engineering when he refused was bad enough, but Data, who had promised to help him, going along with it � to the point of ordering him around � was like a slap to his face.� "Fine!� Whatever!� Do what you want!"�

He stalked out of Engineering.� As he headed down the cross corridor he had to take to reach the closest turbolift to his quarters, Data said, "We are not going to your quarters, Q.� We are going to mine."

Q spun and stared at him.� "What, you don't even trust me under house arrest in my own quarters, you have to watch me in your own?"

"No," Data said in his usual reasonable tone.� "I do not believe your room has sufficient computer interfaces to allow us to both efficiently work on your project."

Q stopped dead in the middle of the hallway. "My... project?"

"Your project to discover how to utilize the transporter to duplicate yourself," Data said. "I find it surprising that you have forgotten, considering how recently you expressed the belief that it is your only hope for survival."

He felt as if he'd been poleaxed. "I didn't forget, Data, I... I just didn't realize that... I thought LaForge was..."

"Geordi is unaware of the project," Data said. "I did not know if you had provoked the argument with him in order to be sent from Engineering so you would be able to work in privacy, or if the argument was accidental, but I saw the opportunity to go with you so we could work together, as I had promised, without alerting Geordi to the fact that we would be working on your personal project."

Q took a deep, ragged breath, trying to calm himself down. The whiplash, from anger to relief, was stunning. He felt like he was about to collapse here in the middle of the hallway. Which was totally unacceptable. "I—I didn't realize that was what you were going for. I, uh... hey, isn't that essentially lying to LaForge?"

"My ethical program does not forbid me to lie if it is necessary to do so in order to accomplish an ethical objective, such as saving a life," Data said. "Moreover, I did not in fact lie to Geordi. If you and I succeed in identifying a method of using the transporter to duplicate yourself, I believe you will be much calmer and better able to consider solutions for the problems Bre'el IV is still facing."

"You told him you were going to escort me to my quarters."

"I have changed my mind," Data said blandly.� "My quarters would be more appropriate."

Q laughed, harder than the situation really warranted, and cut himself off before it turned into hysterics.� "Oh, Data.� You amaze me sometimes."�

He followed Data toward Data's quarters, a sensation not entirely unfamiliar, but not one he was accustomed to, burning through him. He had completely misread Data's intentions, had felt hurt and betrayed and had lashed out at Data for doing exactly what he had wanted Data to do, and it made him feel... ashamed. It wasn't exactly the first time — it wasn't even that it had been new when he'd first felt it as a human, when Data had almost died for him, since it was rather difficult to be thrown out by one's entire species for crimes and violations of protocol without feeling shame about it — but it wasn't an emotion he was particularly used to, and it hurt. Reluctantly, because he didn't want to lose face by stooping to apologize, but he couldn't see any other way to alleviate the guilt and shame he felt, he said, "I'm, uh, I'm sorry I snapped at you. I was... I thought I wasn't going to be able to work on this, because LaForge had those questions about the tidal stresses, and I... well, I'm under a lot of stress right now."

"It is not necessary to apologize to me," Data said.� "I am an android.� I am incapable of taking offense at your actions."

Q rolled his eyes.� "I'm not apologizing to make you feel better, Data, I'm apologizing to make me feel better.� I did something stupid, and I got angry at you and it wasn't justified and I'm sorry about it."

"In that case, I accept your apology," Data said.� "Although, I cannot accept your apology on any behalf but my own.� If you intend to apologize to Commander LaForge, you will have to do so directly."

"I wasn't planning on it.� I'm not sorry I yelled at him."

"Why not?"

From anyone else, it might have been a pointed question. From Data, it was mere curiosity. "LaForge really did ask me a stupid question when I had better things to do."

"But he is not aware of your 'better things to do', as we are concealing that information from him. And given the extent to which your knowledge of physics surpasses ours, it was impossible for him to know that it was a 'stupid question', as you say, until you had responded to it."

Q did not want to be convinced that LaForge had been justified and he'd been overreacting. "Anyway, you know, you really need to stop doing that 'I'm an android' thing."

"If you are referring to my statement that I am an android, it is a truthful statement. I am not sure why you believe I need to stop stating the facts of my nature."

"It's not that you're an android, it's the whole 'I am an android, so you don't need to treat me with respect, like you would any other sentient being' thing you do. Why do you even do that? Why tell people they don't need to apologize to you?"

"Because I cannot be offended by a lack of an apology."

"So you think the only reason people apologize is that they're sucking up, trying to make other people overlook their bad behavior?� Or do you just think that about me?"

Data looked over at Q, as they were now walking more or less side by side. "It is true that it seems uncharacteristic of you to apologize."

"Well, I don't normally do anything I feel sorry about doing.� And trust me, Data, if I was any good at sucking up to people who think I've done them wrong and getting them to let bygones be bygones, I wouldn't be here."

Data cocked his head.� "I had not considered that, but you do make a valid point."

"And it doesn't matter anyway.� People don't need to apologize to non-sentient machines, or holograms, or animals far beneath them on the evolutionary ladder, but if the concept of an apology is meaningful at all, then it's a protocol that sentient beings exhibit with each other.� You're a sentient being, therefore you deserve an apology when it's justified."

"That opinion is not universally held," Data said.

"Yes, there's no shortage of idiocy in the universe, but you don't need to pander to it."

"If you are so convinced that I am a sentient being who deserves to be treated in an identical manner to other sentient beings, why did you refer to me as 'The robot who teaches the course in the humanities' in Ten-Forward two days ago?"

"Did you ever notice me refraining from insulting any other sentient beings, Data?� It is ironic that Picard assigned an android to teach me about being human. Anyway, I was angry at Guinan.� She'd just stabbed my hand with a fork, remember?"

"Why would your anger at Guinan lead to you attempting to insult me?"

Q sighed. "Can we talk about something else?"

"Very well," Data said. "I am concerned that it may not be possible to do what you hope to be able to do. I have performed some preliminary research, and I have concluded that there is insufficient data for us to be able to reliably construct a reflective screen for a transporter signal. However, you may have the additional information I would require. Otherwise, I would be forced to conclude that we would require more time, as well as the ability to directly analyze certain planetary atmospheres, and it would be impossible to accomplish what you mean to do within two days."

It was funny, Q thought, that all the ridiculous metaphors humans used to describe their emotional states actually turned out to be barely metaphorical at all. It really did feel like his blood was somehow much colder than it ought to be. Data had warned him earlier that he wasn't sure the duplication could be done, but he hadn't really looked into it then. Now, apparently, he had... and was more certain than he had been before that it couldn't be done. Maybe he would have preferred talking about why he had insulted Data in Ten-Forward two days ago after all.

But then, this was what he'd wanted. "What have you got?"

"I have begun by analyzing instances of transporter duplication.� There are five in Federation databanks.� The most recent, and the only one which had the result of a full identical duplicate, is the incident with Commander Riker when he was a lieutenant aboard the Potemkin... which we were unaware had resulted in a transporter duplicate until you informed us.� The first incident, which was reliably found to create duplicates who differed considerably in psychology, occurred to the original starship Enterprise under the command of Captain Kirk..."

By the time Data had finished disgorging everything he knew about the five transporter incidents, they had reached his quarters and begun setting up a second terminal for Q to use (which involved moving a lot of art supplies, myriad books, and an assortment of exotic costumes out of the way... apparently being an android did not necessarily involve being particularly neat). It was much too much information, really, but Q didn't dare tell him to skip something, because he had no idea what amidst Data's babble might prove relevant. The truth was, he really didn't remember all that much about the mechanics of how the transporter accidents had happened; he had just known it, at the time, but he hadn't put much thought into it, and things he hadn't thought about much when he was a Q hadn't generally carried into his human brain very well.

"In short," Data said finally, much too late for the expression to be at all accurate, "all such incidents have taken place in an environment of severe atmospheric disturbance.� I have attempted to initiate two beams, in computer simulations, but even with the fail-safes removed the beam will automatically merge back into itself in the buffer, and reform into a single transport object.� There does not appear to be a way to force the beams to remain separate, and pull mass from the target location to form a second instance of the transport object, short of refracting the beam through a planetary atmosphere undergoing severe ionic disturbance."

"Can we mimic the ionic disturbance using the environmental controls?" Q asked.� "Your starships are creating an artificial atmosphere anyway."

"I had considered that," Data said, "but I do not believe so, because the starship is not large enough.� When a surface-to-ship transport is performed, the transport beam must pass through kilometers of atmosphere, typically.� This is sufficient distance for even a slight refraction in the beam to cause the beam to wholly separate.� The Enterprise is not a full kilometer in length, and the gravitic effects caused by an even pressure of artificial gravity generated by the deck plates are not similar enough to the gravitic effects caused by several kilometers of atmosphere on top of a gravity well to produce similar conditions of pressure and atmospheric disturbance.� I am also not sure the environmental controls can actually generate this type of atmospheric disturbance, at least not to the degree required."

"Well, there's got to be something we can do to split the beam."

"I am certain that there is," Data said.� "But with only five known incidents to study, only one of which produced the sort of duplicate you wish to create, I do not believe we have enough information to derive the general principle behind the effect.� With so little data to study, we would be forced to exactly replicate conditions that were known to cause a transporter duplication... but we cannot possibly do so while in the Bre'el system."� He pulled up the specifications for the various planetary atmospheres.� "While the gas giants Bre'el VII and Bre'el IX have similar elemental makeup to the atmosphere of Nervala IV, where Commander Riker was duplicated, the similarity is not sufficient to duplicate the effect.� There have been multiple transports to and from the depths of the gas giants' atmospheres as part of the Bre'elians' mining projects, and none have experienced the issues that the Potemkin experienced with transporting Commander Riker.� Also, no duplicates have been identified."

Q took a deep breath, trying to control his voice.� "You're saying it's hopeless."

"No.� I am saying that we, Starfleet, have insufficient understanding of the principles that caused the effect to duplicate them, and we lack the time and the freedom necessary to do the research that would be required to obtain such an understanding.� However, your knowledge differs from ours, and is in general both considerably greater and reflects a deeper understanding of fundamental principles.� If you were able to explain the principles that caused the beam's refraction to me, I may be able to derive a means to apply those principles using our technology."

Data's speech patterns almost always sounded upbeat, nearly cheerful, even when his words were matter-of-fact. Q knew that he was imagining a hopeful tone in Data's voice, that Data wasn't capable of that degree of emotion, and that even if he was, he probably wouldn't care enough about Q's survival to be hopeful or not. But it sounded like Data was hopeful, like Data wanted Q to survive and was optimistic about the prospect of Q knowing enough about the scientific principles involved here to be able to figure out how to generate the effect.

Unfortunately, Q himself couldn't be nearly so optimistic. The truth was, he had never been all-knowing. The Continuum was all-knowing, and individual Q could obtain any knowledge they wanted, instantly, by querying the Continuum for it. Individual Q were also vastly more intelligent than humans, with far more processing power in their minds, and memories that could retain an eidetic recollection of billions of years, because they weren't limited to the size of a piece of matter that could fit in the space of a humanoid skull, let alone to a material as inefficient as neural tissue built on a platform of hydrocarbons. When Q had chosen to be human, he had pulled together the parts of himself that he thought of as himself, his ego, his personality, and compacted as many of his personal memories as he could around that core, so he could fit as much of his selfness as possible into the much smaller, weaker, less efficient substrate of a human brain. The only Q-knowledge that had come with him into this feeble piece of meat had been knowledge tied closely enough to personal memories that it had automatically come with those memories, because he hadn't wanted to waste limited brainpower on academic, unused knowledge that might or might not ever have any value to him if he could take memories and experiences from his life instead.

The knowledge of the fundamental principles behind the way a primitive mortal matter-to-energy transport technology might be refracted to produce two copies of the same being hadn't been one of those.

Maybe... maybe he could deduce it. He did have a lot of other knowledge about the way the universe worked and the underpinnings of physics that the mortals didn't have. Maybe he could figure it out. Or remember it. It was possible that it had come over with him, wasn't it? Humans could forget things and then turn out to remember them later — unlike the Q, who either remembered something or they didn't, humans could stash memories in parts of their brains where they couldn't easily access them, and then pull them back out if they were exposed to a reminder. He'd taken a good look at the two William Rikers and the circumstances that had created two of them, because if he'd wanted to use the Riker trapped on the planet as a human control to compare to the Riker who became a Q, he had to make sure the Rikers were fundamentally the same. So maybe, just maybe, he had brought that knowledge with him and he just couldn't remember it right now. Maybe if he looked over Data's simulations, he'd remember. Or figure it out.

"Let me see the data you've got," Q said.� Maybe something in a review of the transporter accidents known to Starfleet would jog his memory.� If, in fact, that information existed in his memory at all.� Which he doubted, but it was the only hope he had.

For nearly three hours, he and Data reviewed what they knew about transporter duplication accidents, ran simulations, experimented with force fields and local manipulations of environmental controls, and in general battered their heads against a metaphorical brick wall. There wasn't enough information. He couldn't remember why the beam had reflected, not to the level of detail where he could figure out how to recreate the phenomenon, and five reports on incidents in the past weren't enough for him to work it out.

Eventually, he slammed the PADD he'd been working on down on the table. "Forget it, Data. You were right the first time. We can't do this."

For a moment, Data said nothing. "I am sorry," he said finally.

Q shook his head. "Sorry about what?" he asked, his own voice sounding flat, mechanical, to him. "You didn't do anything."

"I am expressing regret that we were unable to solve the problem," Data said. "It would have been an elegant solution to the situation you face."

"Elegant. Yeah." Q laughed, bitterly. "Well, all of Picard's ludicrous rhetoric about how morally bankrupt I am for wanting to create a copy of myself so I can survive my own death has just been rendered spectacularly moot, at least. Troi said she'd talk to him, but I wasn't looking forward to that conversation; your captain can be an unbelievably stubborn human when he thinks he's right. So, hey, at least I don't have to convince him he's wrong, now."

"That does not sound as if it is of sufficient value to be a consolation," Data said, sounding puzzled.

Q laughed again. "It's not, Data. Trust me. I'm just... what's the human expression? Looking on the bright side?" He sat down heavily on the unused bed in Data's quarters. "You knew this wasn't going to work, didn't you?"

"When we began the research, I believed it was possible that you had knowledge which would enable us to make it work, as I said."

"Who would have ever thought my life would depend on if I can remember what causes a transporter beam to split?" Q stared at the wall, shaking his head again.� "I mean, it sounds ridiculous, doesn't it?� Why would anyone ever need to know such a thing?� And yet... here I am."

"It is possible that we will come up with a different solution," Data said earnestly.

"Yeah, no.� That's not happening, and you know it."� He looked at Data.� "How long were we working on this thing where you knew for a fact we weren't going to come up with an answer?"

"I concluded that the likelihood of our success had dropped to a sufficiently infinitesimal level that Captain Picard would find my expressing it with the necessary number of decimal places to be irritating, approximately one point three seven hours ago."

"And yet you kept working on it with me."

"I did not believe it was my place to make the judgement that we should give up, as it is not my life that is at stake."

His eyes stung, and he felt a wave of gratitude, of... what was he feeling? It almost hurt, it was so intense. Was this... not love, that was ridiculous, he would never fall victim to such a soppy and ludicrous emotion, but... tenderness? If he had the power, right now, he would do anything for Data, anything at all. "Data, I... thanks."

"We were unable to solve the problem," Data said.� "I do not believe I have done anything that warrants your thanks."

"Well, you're wrong," Q said.� "You have.� You tried.� Picard didn't even want to try, for some stupid superstitious application of ancient human morality, but you believed me that there wouldn't have been anything wrong with copying myself, if it had turned out that we could pull it off.� You helped me figure out for myself that we can't do it.� And you let me figure it out for myself after you already knew we were wasting our time, so I'd... so I'd accept it."� He swallowed.� "I'm... okay with it now, you know?� I... I accept it.� We tried our best, and it didn't work out, and... I'm going to die, but at least, at least someone cared enough to try to help me avoid it.� It just... didn't work out.� Reality's like that, right?� Sometimes... it doesn't matter how much you want something, it just isn't going to happen, because that's not the way the universe works.� So... so it's all right."

In a tone that sounded almost hesitant, Data said, "Do you wish to speak to Counselor Troi?"

"No, why would I want to do that?"

"You are stating that you accept this outcome, that you are 'okay with it,' but the tone of your voice and the pattern of your speech indicates considerable emotional distress.� I am not equipped to assist a human being in emotional distress, if it is not possible for me to solve the problem that is distressing them."

Q laughed, harder than he probably should have, and forced himself to stop before the laughter turned into hysteria. "Data, I'm going to be tortured to death by idiots who resent me for a lesson I taught them three hundred years ago, in less than two days from now, and the only solution I could come up with to survive it in any way has just turned out to be non-viable. Of course I'm distressed. But there's nothing you or anyone else can do about it." He smiled sadly. "You've already done more than anyone else has. I'm... I admit it, I can't very well pretend I'm happy about the prospect of dying. Or, honestly, that I'm not terrified. But... I don't know why, I don't think it makes any sense and I don't understand it, but I really do feel better about it knowing that you tried your best to save me. Even though it didn't work."

Data cocked his head slightly. "Humans often express a desire for reciprocity in their relations... the notion that others would treat one in the way that one treats those others is considered desirable, in most cases. Perhaps you perceive your intention to sacrifice yourself to the Calamarain as something you are doing for our benefit, and thus, the thought that we would not attempt to assist you in avoiding that fate caused you to feel that there was a lack of reciprocity... that we would not do for you what you have decided to do for us."

Q shook his head. "I already knew you were willing to risk your life for me, Data. That's not news."

"True."

"Besides, logically it shouldn't matter if you all hate my guts... my one life still doesn't outweigh all the lives on this ship, not if I'm not a Q."

"I do not believe that that is the metric humans generally use to decide on a course of self sacrifice," Data said, "and I find it difficult to believe that you, in particular, would use such a metric when one side of the equation is your own life.� You have always seemed very self-centered... it does not seem plausible that you genuinely believe that the life of others is worth as much as your own."

"What, you think I'm a selfish, worthless waste of protoplasm too?"� Data calling him self-centered actually hurt, because Data didn't say things to hurt people, Data said things when he thought they were true... and he was right, of course.� Q was self-centered, and he didn't know how not to be; a Q who wasn't self-centered wouldn't continue to exist as a separate self, and Q felt torn between shame because he didn't live up to the ethical standards of the species he'd adopted himself into, and angry that they'd have such standards, because how did a species even survive without considering self-protection and individualism a virtue?�

Data raised his eyebrows slightly. "I do not believe that is what I said. The term 'self-centered' does not generally imply the meaning 'selfish, worthless waste of protoplasm.'"

"Oh, yes it does," Q said bitterly.� "When humans say it."

"While I am not sure that is the case, it is possible that I am not programmed with the full range of meanings for the term.� If that is so, I apologize," Data said.� "I was merely curious why it would be that, now that you have become human, you state that you consider your own life to be exactly equivalent to the lives of other humans, when that is not consistent with your behavior."

"Who cares what I think?" Q snapped, jumping to his feet.� He paced as he talked, restless energy and violent emotions surging through him, forcing him into motion.� "Yes, all right, fine, I don't want to die to save a ship full of other human beings who could hardly care less whether I live or die.� You've caught me out.� I don't feel they're worth my life, and I don't want to die for them, and if I could just convince myself that I don't owe any of you anything and it wouldn't be my fault if I ran for it and then the Calamarain killed you all for revenge... except it wouldn't work anyway, because I can't outrun the Calamarain in a shuttle, and I don't want to be trapped on Bre'el IV for the rest of my human existence and I can't guarantee they wouldn't figure out how to follow me down into atmosphere, and none of it matters because it would be my fault.� And because I do owe you.� The Q pay their debts.� Admittedly I'm not sure any Q has ever owed anyone their life before, but that just makes the obligation that much bigger."� He was breathing hard.� "If I didn't owe any of you anything... but Picard agreed to take me in and protect me even though he really doesn't like me much, and Troi flipped the forcefields up and down to drive the Calamarain out when we were in the shuttle together working on the telepathic amplifier, and you nearly died for me, and I can't�"

"I did not save your life to create a sense of obligation in you," Data said gently. "I did so for my own reasons. I am not programmed to allow a person to be killed in front of me, and to do nothing to stop it. As a sentient being, I have the ability to violate my programming, but it causes me distress to do so."

"Yeah? Well, I got news for you, Data, all sentient beings are programmed, and we all have the ability to violate it, and it causes all of us distress to do so. You're not the special snowflake you think you are." This was ridiculous. Why was he hyperventilating, coming close to sobbing again? He didn't want to be talking about this; he didn't even want to be thinking about it.

He leaned against the wall, taking deep breaths, trying to control himself. "The Q pay their debts, all right? And I can look at a bunch of mortal beings and I can assess whether or not it makes sense for some of them to die to save the others because I'm not mortal and I'm not involved, I'm an objective observer, so I can make that call. And then it turns out when I am mortal and I am involved and I'm not objective at all... I can still make that call. Even though I really, really hate the answer."

"Interesting. I would have expected that it would be necessary to be a disinterested, objective observer to make any such determination."

Q shook his head. "To a Q, all mortals are potentially of equal worth. You'd have to do some complex balancing of various traits, and potentials, and the future possibilities on their timeline, to be able to say that one mortal's objectively valuable enough to be worth the lives of several others. And since I'm not a Q now, I can't make that determination. I can't compare this pathetic mortal who used to be a Q to other pathetic mortals and decide who's more valuable, because I'm not a Q anymore. I don't have access to the information I'd use to make the assessment. And my personal feelings on the matter can't be introduced into an objective assessment because we're talking about my life here, of course I don't want to die, so there's a bias and I've got to take anything I might feel about my own worth out of the equation because I'm not objective there at all. So what I'm left with is, I have to presume all the mortals in the situation are of equal worth, and then..." He swallowed. "And then, if I make any attempt to assess value, objectively, I have to observe that there's one mortal in this situation who everyone hates, who has no idea how to survive for himself, who's considered morally inferior by the other mortals, who's already been judged by the Q to be a waste of space, and who's absolutely miserable anyway... and by remarkable coincidence, if that one mortal dies, everyone else gets to live... so it's a no-brainer. I mean, Worf could figure this one out." He walked over to the bed and sat down, heavily. "But, I mean, just because it's obvious doesn't mean I like it. And... and I don't know, if all of you hated me and you'd done nothing whatsoever for my benefit, maybe I could have decided that I just don't care what the obvious correct answer is here... but that's not true. I owe you. And I keep still owing you. And maybe Picard thinks his precious ethics about not making copies of people for whatever ridiculous stupid reason are more important than my life... but you don't. You cared. Troi cared. I don't even like her, she's an obnoxious whiny inferior creature with positively awful telepathic morals, and okay, maybe she's not a rapist but I don't appreciate being telepathically cavity-searched in public... but she cared."

"I do not believe it is accurate to state that I care," Data said. "I believe you are using that expression to refer to the emotional state of concern for another person. I do not have emotions."

"Bullshit."

Data blinked. "What?"

"Hot, steaming piles of fecal matter from the intestinal tract of an ungelded male bovine... with diarrhea. Data, stop saying you don't have emotions. If you didn't you wouldn't be sentient. It's the very definition of the word, you know. Sapient means a thinking being; sentient means a feeling being. The computer system here is sapient, but not sentient. It has no cognizance that it has a self, or that that matters; it has no goals, no dreams, no plans. It makes no value judgements. It doesn't care. You..." He put a hand on Data's shoulder. It was suddenly very important to him that this being stop belittling his own sentience. "You decide what is good, and bad, based on your feelings on the matter, and then you act accordingly. You make decisions. You can't do that without emotions."

"I am programmed to make certain decisions."

"So is Picard. It's just that neither you nor anyone on this ship can look at his source code. Your programming is simpler than a human's, Data, and more elegant; they're largely programmed by genes and stochastic chance. You were created by a sentient being, so you're not nearly as much of a giant mess as the creatures that just evolved randomly. But it doesn't mean they're not programmed just because no sentient being wrote the code that runs them. Besides, didn't you say you can violate your programming?"

"I can, if I wish... but I do not wish to, because violating my programming violates my sense of self. Such violations are by definition things that I do not do, because they are things I am programmed not to do."

"Yeah, you're not different from anyone else. They don't do the things that they feel would be contradicting who they are, either. Or they do do them, in which case they feel really bad about it, or convince themselves that they never actually did it. If the reason you don't do certain things is that it makes you feel bad about yourself if you do them, then that's an emotion. It doesn't matter that you can tell exactly what lines of Soong's code in your brain make you feel that emotion. It's still an emotion. How can you aspire to be more like a human if you have no emotions? Aspiration's an emotion."

"Humans... have often expressed negativity toward the concept that the internal sensations I experience are in fact emotions. I have learned that it is more accurate not to refer to them as such, because human emotions are far more complex and intense than the sensations I experience."

"If someone shot Picard, what would you experience?"

"I would be distressed. I would wish to ensure that he receive immediate medical attention, and if he died I would experience a sense of loss."

"I hate to tell you this, but those are emotions."

"But I would not experience anger, or a desire for revenge against the person who shot him. I would not be impaired by grief or worry. Humans experience such things."

"And this makes you feel that humans are something you ought to aspire to be like, why?"

Data tilted his head, as if considering. "I cannot tell you. I only know that I believe myself to be incomplete. You may be correct that the sensations I experience, the value judgements I make, are emotions, but they are not human emotions. I cannot experience what it is to love, or hate. I have never experienced anger, or fear. I have also never experienced joy, or excitement. My emotions, if they are in fact emotions, are probably most accurately characterized by expressions such as 'guilt', if I do something that violates my programming, or 'satisfaction', if I accomplish one of my goals. I have experienced something like sorrow, I believe. But there are many, many sensations that humans can experience, which I cannot, and I believe that I will be incomplete unless I become capable of experiencing and fully understanding such emotions."

"You're not missing out. Really. You're not."

"I believe that I am. I also believe that a being who has described himself as being 'miserable' is probably not an accurate judge of the value of human emotion."

Q laughed. "Touché, Data. Score one for the android." He sighed. "You know what? Fun as this is, I don't want to stand around discussing theory of mind and the philosophy of emotion with you all night. It just keeps reminding me that I am miserable, and all of the absolutely excellent reasons I have to be miserable. Isn't there anything fun we can do in this dump? Something to take my mind off things?"

"There are many activities that humans aboard this ship consider to be fun. What activities would interest you?"

"I don't know." Q sighed again. "I was bored all the time before I lost my powers. What do humans do when they're going to die in a few days?"

"Most humans who have the level of certainty you do of their own death do not have the freedom to do as they would like. They are generally either prisoners awaiting execution, or they are gravely injured or ill."

"All right, fine. What do humans do to take their minds off the fact that they're likely to die? You can't tell me that doesn't happen all the time."

"Humans engage in various activities when they fear impending death. Often, they prioritize spending time with loved ones when they believe they have little time left."

"That's not going to work for me."

"Indeed. You do not have loved ones."

Stung, Q snapped, "What, just because they threw me out and left me here to die, you think I never loved any of them?"

Data blinked. "I apologize. For some reason, it had not occurred to me that the Q would be capable of love for each other."

"We're omnipotent. How do you get the idea there's something we're not capable of?"

"I had been of the impression that the Q were above human emotion."

"Human emotion, yes. We experience Q emotions. Which are a lot less annoying, overall. But, okay, fine, it doesn't matter if there are people that I care about that I wish I could see before I die, if none of them are here and they're not likely to come see me due to the aforementioned throwing me out and leaving me to my death. I mean, any of them who actually feel anything for me more compassionate than gleeful schadenfreude are probably doing their best not to watch me right now. They're certainly not going to show up here for a tender farewell." He sighed. "You got anything else?"

"While it is related to love, it is not identical, and can be conducted with strangers. Humans often seek out sexual experiences when they believe they are going to die."

Q snorted. "Are you offering?"

"Offering what?"

"Didn't think so. It doesn't really do much good to point out to me that humans like to have sex if there isn't anyone around for me to do it with."

"Is that something you would be interested in?"

"Frankly, no. I'm aware that humans seem to find the sweaty entanglement of their bodies remarkably entertaining, but I haven't the vaguest idea what they see in it, and now that I've actually had to use those organs for waste disposal, the entire concept seems positively nauseating."

Data nodded. "Humans also seek out novel experiences that they have not previously had. For instance, humans who believe they have a terminal illness may choose to travel, or to engage in high-risk activities such as feats of exploration."

"Can we limit this to the pool of things that's remotely reasonable for me to accomplish?"

"Perhaps you might find something to distract yourself on the holodeck. There are many experiences that the holodeck can simulate which you have probably not engaged in with a human body, which you might find pleasurable."

"Such as?"

"Skiing, swimming, snowboarding, whitewater rafting, canoeing, fishing, rock climbing, deep sea diving, gliding, powered ground vehicle racing, horseback riding, bungee jumping, sledding, tree climbing, spelunking, riding roller-"

Q cut Data off. "So in other words, the only thing humans find fun is being on a planet, or pretending they are, and moving around on its surface in some way or another?"

"No, there are many activities humans appear to enjoy, but I am limiting the list to the set they seem most willing to engage in alone. There are various games and sports humans play for enjoyment, but most require at least a partner if not a team."

Q sighed heavily. "None of this sounds remotely intriguing. I don't want to spend what little time I have left learning some sort of physical skill I'll never have the opportunity to use in a real-world context, and honestly I never found planets all that entertaining anyway, unless I was building one. Isn't there--"

Data's combadge bleeped. Data held up a hand in a "hold that thought" gesture. "Data."

LaForge's voice came over the combadge. "Data, is there any chance you can get free of Q yet? The tidal disruptions have gotten really bad, so the Science Council's sending us someone, a Dr. Ese'ar, to help out with that. I could use you back here."

For a moment Q thought the person LaForge was referring to was for some inexplicable reason using an acronym from the Earth Standard alphabet, SAR, and given that this person was Bre'ella this made no sense whatsoever -- Q himself might be using a letter from the Earth Standard alphabet, but only because his true name was unpronounceable in any mortal language and so he always picked something relatively arbitrary as a translation. Well, his people's true name. Well, his former people's. "No, I have him chained up in a dungeon," he said into Data's combadge, sarcastically.

"That is an untruth," Data said, tilting his head slightly. "Q, why are you attempting to mislead Commander LaForge?"

On the combadge, LaForge sighed. "He's not, Data. I asked if you could get free of him and he's being sarcastic about it. Look, you don't have to stay with him. Even if he's being a complete ass, he's got quarters he could stay in. It's not like anyone else is going to be stuck dealing with him if you don't."

"And what if I wanted to help?" Q asked, unable to keep the bitterness out of his voice. LaForge obviously couldn't comprehend that Data might have any reason to stay with Q aside from keeping Q from annoying other people.

"Well, if you actually want to help and not throw another temper tantrum, you're welcome to come to Engineering too, but it's Data I really need. Data, the escort team should be bringing Dr. Ese'ar by in another few minutes; can you be here to meet her?" This time, Q heard the name clearly enough to guess at its construction. A woman's name, with the apostrophe construction that marked her clan position, like Bre'el itself. He couldn't remember what the names meant -- his brother might have hijacked large tracts of this weak slab of meat that supported his mind with all his stupid updates about the Kaeloids, but the Bre'ella lived far enough from the Kaeloids that the updates hadn't included much about them.

"I can," Data said. He turned to Q. "It is not necessary for you to come if you do not wish to, but if you do desire to help, your assistance will be welcome."

"It's not like I have anything better to do."