• Queen HawlSera · 12 points · 2 months ago

    It’s almost like we can’t make a machine conscious until we know what makes a human conscious, and it’s obvious Emergentism is bullshit because making machines smarter doesn’t make them conscious

    Time to start listening to Roger Penrose’s Orch-OR theory as the evidence piles up - https://pubs.acs.org/doi/10.1021/acs.jpcb.3c07936

    • @blakestacey@awful.systems · 32 points · 2 months ago (edited)

      The given link contains exactly zero evidence in favor of Orchestrated Objective Reduction — “something interesting observed in vitro using UV spectroscopy” is a far cry from anything having biological relevance, let alone significance for understanding consciousness. And it’s not like Orch-OR deserves the lofty label of theory, anyway; it’s an ill-defined, under-specified, ad hoc proposal to throw out quantum mechanics and replace it with something else.

      The fact that programs built to do spicy autocomplete turn out to do spicy autocomplete has, as far as I can tell, zero implications for any theory of consciousness one way or the other.

    • @V0ldek@awful.systems · 11 points · 2 months ago

      Orch-OR

      Never heard of this thing, but just reading through the wiki:

      An essential feature of Penrose’s theory is that the choice of states when objective reduction occurs is selected neither randomly (as are choices following wave function collapse) nor algorithmically. Rather, states are selected by a “non-computable” influence embedded in the Planck scale of spacetime geometry.

      Neither randomly nor algorithmically, rather magically. Like really, what the fuck else could you mean by “non-computable” in there that would be distinguishable from magic?

      Penrose claimed that such information is Platonic, representing pure mathematical truths, which relates to Penrose’s ideas concerning the three worlds: the physical, the mental, and the Platonic mathematical world. In Shadows of the Mind (1994), Penrose briefly indicates that this Platonic world could also include aesthetic and ethical values, but he does not commit to this further hypothesis.

      And this is just crankery with absolutely no mathematical meaning. Also, pure mathematical truths are not outside of the physical world; what the fuck would that even mean, bro.

      I thought Penrose was a smart physicist; what the hell is he doing peddling this?

    • @V0ldek@awful.systems · 9 points · 2 months ago

      and it’s obvious Emergentism is bullshit because making machines smarter doesn’t make them conscious

      This is like a 101 example of bad logic: “this sentence is false because I failed to prove it just now”.

    • @decivex@yiffit.net · 7 points · 2 months ago

      Throwing out emergentism because some linear algebra failed to replicate it is a pretty bad take.

    • @frezik@midwest.social · 5 points · 2 months ago

      You’re right that consciousness and intelligence are not the same. Our language tends to conflate the two.

      However, evolution produced consciousness over billions of years through emergent factors alone, with no specific direction beyond reproductive success. We can likely get there orders of magnitude faster than evolution did. The big problem will be recognizing it for what it is when it arrives.

        • @Amoeba_Girl@awful.systems · 8 points · 2 months ago

          I mean, assuming it is at all possible (or rather that the problem even means anything), I suppose four billion years is a rather generous deadline.

        • @WolfLink@lemmy.ml · 3 points · 2 months ago

          If I practice shooting hoops every day, I’m going to get one in a lot sooner than you will just kicking at the ball every time you walk by.

          • Thwart Leader · 8 points · 2 months ago

            @WolfLink So you’re saying there’s a measurable correlation between practicing a skill and getting better at it? Amazing.

            What’s this got to do with the Big Averaging Machine?