Amazon Echo | Hacker News

The doorbell rings. Young daughter answers. Nobody is there. She looks down. There’s a package. From Amazon . . .

Be sure to make the video on the left fullscreen after clicking this link. Take note of whether the simple aural addition of strings is enough to evoke an emotional response in you. How scary is this on a scale from 1 to 10 (where 10 is terrifying)?

Remember, this product is real.

via Amazon Echo | Hacker News.

INFORMATION DOESN’T WANT TO BE FREE by Cory Doctorow | Kirkus

I’m excited to find out about this book. I wasn’t aware that Cory was working on something non-fiction. The second quote below defines the internet as “one great big copy machine” which is amusingly accurate. I had the opportunity to ask an interviewee for a one-word definition of the internet. Her response was the word “open” followed by a string of warm musings about sharing and connecting directly with others to exchange everything from ideas to art. “Copies” wouldn’t be a bad definition either, though I sincerely hope we can re-position that term as a positive one.

The first quote from Palmer and Gaiman expresses a belief that is widely shared on the web. I wonder whether it is a belief or a truth, because these industries still exist, still producing and distributing content. It feels more like their business model shifted away from the direct exchange of content for money after they fought so hard against the web. Things like Patreon are fascinating examples of alternative means to make the exchange more meaningful and direct.

“We are a new generation of artists, makers, supporters, and consumers who believe that the old system through which we exchanged content and money is dead. Not dying: dead.”

Instead, the author advocates for a liberalized system of copyright laws that finally admits that the Internet, for all its virtues and diverse purposes, is nothing but one great big copy machine, and it’s not going away.

via INFORMATION DOESN’T WANT TO BE FREE by Cory Doctorow | Kirkus.

Had an idea for a thing. At the moment I’m calling it <broken poets>.


Clark was having one of his moments.

"There were roughly three thousand people using public transport that day; thirty-four were riding on the bus in question; eighteen were Caucasian, twelve were Hispanic, two were black, two were Asian; nine were wearing hats: three were dark, six were light; one of the Hispanic women was wearing gloves – it wasn't cold that day; three of the Caucasian women were wearing winter coats – it was not cold that day; and one of the Caucasian men noticed this disparity in dress before the crash that killed thirty-three of them – I should not have questioned them about their seasonally inappropriate attire so close to the end of their mortal lives."

The department store worker asked again, "Do you need this gift wrapped, sir?"

"That would be nice. I could give it to someone if I chose to. It's always good to have your options open. You have a very nice choice of attire that is seasonally appropriate."

"Thank you. Just one final ribbon –"

"Tremendous! I can tell you're going to live for a long time. God doesn't take the talented or well-dressed."

Superintelligence

This article pointed me toward the book Superintelligence, which in turn pointed me toward a shorter document by the author, quoted below. Anyone wondering what to gift me this Christmas can look into the linked book above. I enjoy being terrified by other people’s thoughts.

Two things are striking in the quote below:

  1. The future of humanity could be decided by algorithms, themselves iterated through countless generations of machine-written algorithms, all beginning with something being coded today (hopefully without any bugs or typos).
  2. The implication that superintelligence would eradicate human invention.

The first point is terrifying. I’d like to believe that if such a superintelligence is brought forth it would be smart enough to fix any bugs or major design flaws in the original. Of course I assume that what a superintelligence wants and what mere human intelligences want will differ in profound ways. What then?

The second point is, I believe, wrong (assuming we’re using the word ‘invention’ similarly). Unless humanity has been exterminated by this superintelligence, invention will not cease. The more fictional forms of invention (e.g. art) should flourish. I strongly believe that humans are a necessary component in art. Creation, reception, critique, categorization, and other components require human beings.

Superintelligence, if/when it materializes, will spur a Renaissance in human artistic production.

<that’s what I think anyway>

Superintelligence would be the last invention biological man would ever need to make, since, by definition, it would be much better at inventing than we are. All sorts of theoretically possible technologies could be developed quickly by superintelligence — advanced molecular manufacturing, medical nanotechnology, human enhancement technologies, uploading, weapons of all kinds, lifelike virtual realities, self-replicating space-colonizing robotic probes, and more. It would also be super-effective at creating plans and strategies, working out philosophical problems, persuading and manipulating, and much else beside.

It is an open question whether the consequences would be for the better or the worse. The potential upside is clearly enormous; but the downside includes existential risk. Humanity's future might one day depend on the initial conditions we create, in particular on whether we successfully design the system (e.g., the seed AI's goal architecture) in such a way as to make it "human-friendly" — in the best possible interpretation of that term.

via Nick Bostrom.