{"id":2872,"date":"2016-01-30T16:17:23","date_gmt":"2016-01-30T23:17:23","guid":{"rendered":"http:\/\/www.sheer.us\/weblogs\/?p=2872"},"modified":"2016-01-30T16:17:34","modified_gmt":"2016-01-30T23:17:34","slug":"rights-for-electronic-life","status":"publish","type":"post","link":"https:\/\/www.sheer.us\/weblogs\/nnn\/rights-for-electronic-life","title":{"rendered":"Rights for electronic life"},"content":{"rendered":"<p>So, recently I ran across <a href=\"https:\/\/en.wikipedia.org\/wiki\/SyNAPSE\">this<\/a>.<\/p>\n<p>My first reaction was, holy shmoo, the singularity is almost here! <\/p>\n<p>Actually, there are all kinds of interesting problems here. I&#8217;ve talked with a number of my friends about the question of whether, if we created an accurate software model of a human, it would exhibit free will. It&#8217;s a really interesting question &#8211; if the answer is yes, that&#8217;s a serious blow to theology but a major boost to the rest of us. <\/p>\n<p>But a natural side question comes up: suppose we can get the neuron count up from a million to a billion per chip. If Moore&#8217;s law were to hold, this would take &#8211; let&#8217;s see, 1, 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024 &#8211; 10 doublings, or 10 18-month cycles, about 15 years. At that point, making a 100-billion-neuron mind out of the chips becomes practical. Said creature has as many neurons as we do &#8211; but is it a person?<\/p>\n<p>My guess is, legally, initially, no. In fact, we&#8217;ll probably see all sorts of awful behavior as we debug, including repeatedly murdering the poor thing (turning off the power, over and over).<\/p>\n<p>We may even see them turned into slaves, although I really hope we&#8217;re beyond that by now. I don&#8217;t mind enslaving small neural nets that will never show free will or understand suffering, or enslaving Turing machines, which are incapable of an original thought, but the idea of enslaving something that&#8217;s as capable as we are is disturbing. 
<\/p>\n<p>At some point, however, we&#8217;ll have to acknowledge that a person&#8217;s a person, no matter what they&#8217;re made of. I see signs we&#8217;re moving in this direction with India granting personhood to dolphins (about bloody time!) and I hope someday to see it granted to any individual who can pass the mirror test. (If you know you&#8217;re a person, then you are.)<\/p>\n<p>It does remind me of &#8220;Jerry Was a Man&#8221;. It&#8217;s a question we&#8217;ll have to wrestle with &#8211; I hope we haven&#8217;t gotten so locked into the idea that electrons just do what we tell them to with Turing machines (where that&#8217;s true) that we can&#8217;t realize that if we build a sufficiently large neural network out of transistors, it has the same rights that we do &#8211; in fact, &#8216;birthing&#8217; might be a better phrase than &#8216;building&#8217; here, since we are undoubtedly creating a new life form.<\/p>\n<p>There are all sorts of interesting corollaries to this as well. If we succeed in building something self-aware out of transistors, our race will be experiencing first contact. Granted, we&#8217;ll have *built* ET instead of met him out there in the sky, but that doesn&#8217;t change the fact that it is first contact. A life form made out of silicon is likely to be *different* &#8211; to have different values, enjoy different things. This has been explored quite a bit in science fiction, but it was complete news to me that I was going to see it in my lifetime (assuming the actuarial tables describe me) as science fact.<\/p>\n<p>If we build something 100 billion neurons in size and it&#8217;s *not* self-aware, this also has interesting implications &#8211; it raises the question &#8220;Where is the magic coming from?&#8221;. 
This outcome would also be incredibly cool, and lead us off in another, equally interesting set of adventures.<\/p>\n<p>There&#8217;s also the question of the singularity &#8211; what happens when we build something with 200 billion neurons? There&#8217;s another article I keep meaning to write about intelligence and stability, but one interesting thing I would note is that, plus or minus a few percent, all humans have the same 100 billion neurons; therefore, increased intelligence or performance in our minds comes from changing the way we connect them. It&#8217;s possible that a larger neural net won&#8217;t be more intelligent at all &#8211; or that it will be completely unstable &#8211; or that it will be much, much, *much* more intelligent. In the latter case, all of us are going to be curious about what it has to say &#8211; and in any case, we&#8217;re going to learn a lot of interesting things.<\/p>\n<p>However, I do think we should all sit down and talk about the ethical issues *before* we build something that should have legal rights. I think we probably will &#8211; the subject has been addressed in numerous forums, so it&#8217;s undoubtedly something people are aware of. It&#8217;s one of my favorite Star Trek themes, explored numerous times in TNG.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>So, recently I ran across this. My first reaction was, holy shmoo, the singularity is almost here! Actually, there are all kinds of interesting problems here. I&#8217;ve talked with a number of my friends about the question of whether, if we created an accurate software model of a human, it would exhibit free will. 
It&#8217;s a [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[12,16],"tags":[],"_links":{"self":[{"href":"https:\/\/www.sheer.us\/weblogs\/wp-json\/wp\/v2\/posts\/2872"}],"collection":[{"href":"https:\/\/www.sheer.us\/weblogs\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.sheer.us\/weblogs\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.sheer.us\/weblogs\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.sheer.us\/weblogs\/wp-json\/wp\/v2\/comments?post=2872"}],"version-history":[{"count":1,"href":"https:\/\/www.sheer.us\/weblogs\/wp-json\/wp\/v2\/posts\/2872\/revisions"}],"predecessor-version":[{"id":2873,"href":"https:\/\/www.sheer.us\/weblogs\/wp-json\/wp\/v2\/posts\/2872\/revisions\/2873"}],"wp:attachment":[{"href":"https:\/\/www.sheer.us\/weblogs\/wp-json\/wp\/v2\/media?parent=2872"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.sheer.us\/weblogs\/wp-json\/wp\/v2\/categories?post=2872"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.sheer.us\/weblogs\/wp-json\/wp\/v2\/tags?post=2872"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}