this post was submitted on 04 Sep 2024
182 points (96.0% liked)

Technology

Worm's brain mapped and replicated digitally to control obstacle-avoiding robot.

you are viewing a single comment's thread
[–] [email protected] 4 points 1 month ago* (last edited 1 month ago) (24 children)

You're coming at this from a slightly askew angle. Consciousness is holographic - that is, it's complex behavior arising from the interactions of a more complex system. There's nothing "more" to it than what we see. The transporters from Star Trek, which destroy you and then reproduce you exactly, would change nothing about your experience. You're just a complex arrangement of atoms, and it doesn't matter where that arrangement occurs so long as it's unique. There is no "you"; there's just "stuff" stuck together in a way that lets it think that it can think about itself. A perfect reproduction would result in the same entity, perfectly reproduced.

[–] [email protected] 7 points 1 month ago (15 children)

A perfect reproduction would result in the same entity, perfectly reproduced.

It would, but I remain convinced that the continuity of my experience would end, the same as if I had died. The entity who came out the other side would believe itself to be me, and believe itself to be unscathed, but it would actually exist only until the next time it stepped into a transporter, when the cycle would repeat.

[–] [email protected] 7 points 1 month ago (14 children)

continuity of my experience would end

Why? What property is altered that would 'end continuity'? It kinda just sounds like a personal delineation... a personal preference, like being annoyed at being 'interrupted'.

[–] [email protected] 5 points 1 month ago (1 children)

Think of an alternative scenario: not transportation, but duplication. The original stays where it was, but a copy gets created elsewhere. To the copy, it will seem as if it got transported there. To the original, nothing will have happened.

Now you kill the original.

The only difference from the transporter scenario is the timing of when the original is ended.

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago)

Ah, but it wouldn't be a copy of the original. In a hypothetical Star Trek transporter accident that results in a duplicate, there would be an instant of creation where the dupe and the original would be truly identical - and then the question would be: which one of those two is 'you'? They'd be experiencing identical things, so how could you tell them apart? What would even be the point? They're identical; there is, by definition, no difference between them. The differences would only come once the duplicate is exposed to different (for lack of a better term) 'external stimuli' than the original, like a different view of the transporter room or the sensation of suddenly and rapidly growing a goatee. Your perception wouldn't continue with the duplicate because your experience would be different from the duplicate's (for example, you wouldn't have mysteriously grown a goatee).

If you destroyed the original and then made the duplicate, it would start at that same moment of total equivalence, but there would be no deviation. There'd just be one version, identical to the original, moving forward through time. 'You' would just go on being you. Consciousness isn't a 'thing' - it's not magic, it's just a weird state that arises in sufficiently complex systems. You and I and everyone else in this thread aren't special; we're just extremely densely packed networks that have the ability to refer to themselves as an abstract concept.

It's similar to the classic "But how do I know that what I see as the color green is what you see as the color green?" question. The answer is that the "color green" we see isn't real; 'green' is just a nerve impulse that hits a network. Each photoreceptor just sends a signal. If we were computers, the world would be represented as an array of values, and the question would become the much clearer "How do I know that what I see as G_101.34 is what you see as G_101.34?" - it just isn't quite as punchy.
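
To make that concrete, here's a rough sketch of the 'array of values' idea (the names and numbers are made up purely for illustration, not how eyes actually work):

```python
# Toy sketch: if perception were just numbers, "what I see as green" becomes
# directly comparable between two observers. All names and values here are
# made up for illustration only.

def sense(patch):
    """Reduce a patch of the world to raw channel readings,
    the way a photoreceptor reduces light to a signal."""
    r, g, b = patch
    return {"R": r, "G": g, "B": b}

leaf = (12.0, 101.34, 20.5)  # some green-ish thing out in the world

what_you_see = sense(leaf)
what_i_see = sense(leaf)

# The "is your green my green?" question collapses into a plain comparison:
print(what_you_see["G"] == what_i_see["G"])  # True - we both got G = 101.34
```

Real brains obviously aren't this tidy, but the point is the same: 'green' is just whatever value comes out of the sensor, not some extra thing sitting on top of it.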
