Thursday, July 10, 2014

"Subtleism" is a Useful Word

Allison Kaptur has written about the last of Hacker School's lightweight social rules: "No Subtle -isms":
Our last social rule, "No subtle -isms," bans subtle racism, sexism, homophobia, transphobia, and other kinds of bias. Like the first three rules, it's targeting subtle, accidental, mildly hurtful behavior. This rule isn't targeting slurs, harassment, or threats. These kinds of severe violations would have consequences, up to and including expelling someone from Hacker School. 
Breaking the fourth social rule, like breaking any other social rule, is an accident and a small thing. In theory, someone should be able to say "Hey, that was subtly sexist," get the response "Oops, sorry!" and move on just as easily as if they'd well-actually'ed. In practice, people are less likely to point out when this rule is broken, and more likely to be defensive if they were the rule-breaker. We'd like to change this.
When this was explained to me by Hacker School Co-Founder Sonali Sridhar, I thought it was brilliant, but I heard "subtle -ism" as a single word, "subtleism". "Subtleism" conveyed to me the concept that something could be harmless by itself, but multiplied by a thousand could be oppressive. So for example, using "you guys" for the second person plural when both men and women are included is never meant to be sexist, and is rarely taken the wrong way. But an ocean of hundreds or even thousands of tiny, insignificant locutions like "you guys" can drown even a strong swimmer.

The reason subtleism is a useful word is that it can convey forgiveness in a context of working together to create a culture that is supportive of a diverse team. Reminding someone of a subtleism doesn't need to be a "shaming ritual"; after all, everyone uses subtleisms all the time. Compare the word "micro-aggression", which is used as an accusation or a lamentation.

Also, the word we should be using for the second person plural is "youse".

Sunday, June 29, 2014

Is Freemium Really Open Access?

Should the term "Open Access" be restricted to materials with licenses that allow redistribution, like Creative Commons licenses? Or, as some advocate, only materials that allow remixing and commercial re-use, like CC-BY and CC-BY-SA?

I had lunch today with folks from OpenEditions, a French publishing organization whose ebook effort I've been admiring for a while. They're here in Las Vegas at the American Library Association Conference, hoping to get libraries interested in the 1,428 ebooks they have on their platform. (Booth 1437!)

Of those, 1,076 books are in a program they call "Open Access Freemium". With these books, you can read them on the OpenEditions website for free, without having to register or anything. You can even embed them in your blog. So for example, here's Opinion Mining et Sentiment Analysis by Dominique Boullier and Audrey Lohard:



So is it OpenAccess™?

In this freemium model, the main product that's being sold is access to the downloadable ebook- whether PDF or EPUB. For libraries, a subscription allows for unlimited access with IP address authentication, along with additional services. Creative Commons licenses, all of which allow for format conversion, wouldn't work for this business model, because the free HTML could easily be converted into EPUB and PDF. They have their own license; you can read it here.

This is clearly not completely open, but there's no doubt that it's usefully open. For me, the biggest problem is that if OpenEditions goes away for some reason- business, politics, natural disaster, or stupidity- then the ebooks disappear. Similarly, if OpenEditions policies change or URLs move, the embeds could break.

On the plus side, OpenEditions have convinced a group of normally conservative publishers of the advantages of creating usefully open versions of over a thousand books. It's a step in the right direction.


Saturday, June 28, 2014

Overdrive is Making My Crazy Dream Come True

Fifteen years ago, I had this crazy dream. I imagined that popular websites would use fancy links to let their readers get books from their local libraries. And that search engines would prefer these links because their users would love to have access to their library books. I built a linking technology and tried to get people to use it. It never took off. I went on to do other things, but it was a good dream.

Tonight, at the opening of the exhibits at the American Library Association in Las Vegas, Steve Potash, the Founder of Overdrive, pulled me aside and said he had something cool to show me.
  1. Go search for a popular book on Bing or try this one.
  2. Notice the infobox on the right. Look at the Read This Book link. Click it.
  3. Now check out this Huffington Post article. Note the embedded book sample.
  4. If the Overdrive system recognizes you, it takes you to your library's Overdrive collection. If not, when you click "Borrow" you get a list of Overdrive libraries near you.
It's an embed from Overdrive. Even works here:

Wow!

The read-on-site feature sometimes works and sometimes doesn't, so there are still a bunch of kinks for Overdrive to work out. But that's not really the point.

The reason the Huffposts and Buzzfeeds of the world like this is not so much the customization of a link, which is what I was trying to sell, but rather the fact that the book is embedded on the host website. This keeps people on their site longer, and they click more ads.

Embeds are the magic of the day. You heard it here first.

Speaking of dreams, I'm having a hard time in Vegas figuring out what's real and what isn't.

Friday, June 20, 2014

One Human Brain = 8 Global Internets (Total Data Rate)

It's a trope of science fiction movies. Somebody gets the content of their brain sucked out and transferred to some other brain over a wire. So how much data would that be?

http://arxiv.org/abs/1307.2196
This question occurred to me a while ago when I attended "Downloading the Brain", a panel discussion hosted by the World Science Festival. Michel Maharbiz of UC Berkeley talked about the "Neural Dust" his group is developing. (Apparently the paper with their first results is in the review process.) The "Dust" is a new way to monitor neurons without needing a wire. A tiny CMOS silicon chip, just a hundred microns or so across, is placed on the neuron. On the chip is a piezoelectric resonator. An acoustic wave is used to power and interrogate the chip, and thus read out the electric fields in the neuron. It's very similar to the way RFID chips are interrogated by radio waves to do things like locating books in a library or trucks on a highway.


So imagine you could use this neural dust, one mote per neuron, to "read out" a brain. How much data would this be?

Get out your envelope backs!

http://dx.doi.org/10.3389/neuro.11.002.2010
The cerebral cortex of a typical human brain has about 20 billion neurons.  Which is a lot of neurons, so forget about dusting them. But pretend you could. I think ten thousand bytes per second ought to be enough to capture the output of a neuron. A single neuron isn't that fast- it fires at most 200 times a second. But we don't really know how much information is in the precise timing of the pulses, or even if the magnitude of a pulse encodes additional information. (There are a HUGE number of things we don't know about how the brain works!)  So 200 trillion bytes per second, or 1.6 petabits per second should be roughly enough to transmit the complete state of a brain. (Not including the connection patterns. That's a whole 'nother envelope!)
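The envelope math above is easy to check. This sketch just multiplies out the rough figures from the text (the neuron count and the assumed per-neuron byte rate are estimates, not measurements):

```python
# Back-of-envelope data rate for a fully "dusted" cerebral cortex.
# Inputs are the rough figures from the text, not measured values.
neurons = 20e9           # ~20 billion neurons in the cerebral cortex
bytes_per_neuron = 1e4   # assume 10,000 bytes/s captures one neuron's output

total_bytes_per_s = neurons * bytes_per_neuron   # 2e14 bytes/s
total_bits_per_s = total_bytes_per_s * 8         # 1.6e15 bits/s

print(f"{total_bytes_per_s:.0e} bytes/s")          # 2e+14 = 200 trillion bytes/s
print(f"{total_bits_per_s / 1e15} petabits/s")     # 1.6 petabits/s
```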


How much is that?

It's about 1 optical fiber. The maximum conceivable bandwidth of a single-mode optical fiber is set by the frequency range where it's clear enough to transmit light without melting. That number is about 1 petabit (10^15)/s, depending on the transmission distance (see this discussion on StackExchange). Bit rates of about 10% of that have been achieved in the laboratory using "Space Division Multiplexing" (see this news article or this keynote from a real expert), while the current generation of optical networking products use multiple channels of 100 Gigabit Ethernet to achieve as much as 10 Tb/s on a fiber, about 1% of the theoretical limit. A petabit per second is a ways in the future, but so is our neuro-dust. Even now, we could probably fit a brain dump on 16 of the laboratory fiber systems.

So we can imagine putting the bandwidth of a brain onto a cable we can hold in our hands.

But how much is THAT?

Cisco puts out a report every year estimating the total traffic on the internet.  This year, they're estimating that the total IP traffic in the world is 62,476 petabytes per month. That's about 190 terabits/second. So a brain readout would be about 8 times the internet's total data rate.

Right now, powering that much dust would be impractical. Currently a neural dust mote uses about half a milliwatt of power, which means 10 megawatts to read out the whole brain. So the brain gets fried, but it was just watching Netflix.
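The internet comparison and the power budget check out the same way. This sketch uses the figures quoted above (Cisco's traffic estimate, the 1.6 Pb/s brain rate, and the half-milliwatt-per-mote figure):

```python
# Compare the brain readout rate to total internet traffic, and
# estimate the power needed to run one dust mote per neuron.
# All inputs are the rough figures quoted in the text.
brain_bits_per_s = 1.6e15        # 1.6 petabits/s brain readout rate
internet_pb_per_month = 62476    # Cisco's estimate, petabytes/month
seconds_per_month = 30 * 24 * 3600

internet_bits_per_s = internet_pb_per_month * 1e15 * 8 / seconds_per_month
print(f"internet: {internet_bits_per_s / 1e12:.0f} Tb/s")       # ~193 Tb/s
print(f"ratio: {brain_bits_per_s / internet_bits_per_s:.1f}x")  # ~8.3x

neurons = 20e9
watts_per_mote = 0.5e-3          # half a milliwatt per dust mote
print(f"power: {neurons * watts_per_mote / 1e6:.0f} MW")        # 10 MW
```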


Saturday, May 24, 2014

The Future of the Book is Unfinished: John Sundman's "Biodigital"

It used to be that a book was finished. Set the lead type and that was the book, for better or for worse. In some ways that's a virtue- the author's pregnancy was finite; the labored give and take with publisher and editor would result in a pretty package of ink on bound paper. But at the same time it's a liability. Non-fiction books become obsolete as time leaves them behind. The artistic process isn't neat and clean. For great works of literature, generations of graduate students pore over notebooks, letters and ephemera to try to figure out what the great artist really meant. Maybe it was just a big fish?

I find that the most interesting things going on in the ebook world now are being done by people who see books as continuing processes that need not be contained within EPUBs or frozen into PDFs. I notice that these creations fit poorly into today's book publishing machine. Formats go flat, conventional copyrights do copy wrongs; ISBNs go bonkers, bookstores start selling teddy bears and libraries look the other way.

Available at Unglue.it
Which brings me to John Sundman's Biodigital. Oh my god it was good.

As a reader, I found it profoundly disturbing. Disturbing the same way I felt the first time I experienced an earthquake. Having grown up on the east coast, earthquakes were abstractions to me. On moving to Palo Alto after college to take a job at Intel, earthquakes became something we joked about in the fab as we heated silicon wafers to 1200°C inside monstrous quartz tubes. I vaguely thought it would be fun to feel the earth shake. The next year I was a graduate student at Stanford and I felt my first real quake, the one centered in Coalinga. At first it was exciting, but then, as the ground started to roll, I began to worry if it was going to stop. When it was over, my cognitive relationship with the ground had changed. I had never doubted its solidity; suddenly I knew different.

Books aren't carved in stone any more. They are mutable. There, I've ruined them all for you. And there's a deadly earthquake in Biodigital. Whoops, I spoiled that one for you too.

Biodigital is a remix. About 60% of it came from Sundman's earlier novel, Acts of the Apostles, or so he tells us. 40% of Biodigital is new. I've not yet read Acts, which makes me one of a very small number of people who have read Biodigital first. (I'll report back after reading Acts.) That subversive knowledge nagged at me through the whole book. "Was this chapter newly written, or was it 'original'?" Also, is the reader meant to know the book is remixed? What's supposed to be real in the book? There's a fictional corporate lab, Emverk, in Biodigital that's clearly supposed to be Xerox PARC, but does that make the fictional company real? Why do I care?

You may not share my paranoia of fictional reality if you read Biodigital, because the reality is extremely vivid and fast-paced. At one point I had the notion that the book was written expressly for me, with inserted references to places I've been, things I've done, and people I've met. I've stood on a ridge on Skyline Drive, I've pored over chip schematics looking for the misplaced hunk of poly causing the glitch on the scope trace; and I've met that crazy guy at the bar in Antonio's Nut House. Somehow I missed that Sundman was there, taking notes. But by the end of the book, things become surreal, dead people start chasing you, and you don't know anymore whether the aromas you were smelling from Peking Garden existed at all.

Acts of the Apostles never really found its place in the publishing pantheon. It was Sundman's first novel. And since Sundman worked in technical documentation in the milieu of the pre-web internet, publishing it himself seemed natural. Soon after Creative Commons licenses were introduced, Sundman's friend Cory Doctorow convinced him to adopt them for his novels in 2003, so Acts was just the second Creative Commons novel ever. Slashdot reviewed it and it became a hackerish cult phenomenon, even outselling Tom Clancy, Michael Crichton and Stephen King - for a few hours - on Amazon. Two sequels, Cheap Complex Devices and The Pains, followed Acts.

But Sundman still wanted a real publisher and the audience a real publisher can connect to. And after many rejections, he finally found a small publishing house that was doing some great things and managed to get the publisher interested. As Sundman recounts, she
offered to hire an editor, at her expense, to read Acts, write an analysis, and make suggestions for improving it. So I said, “fine”, and she did so, and a few weeks later she sent me the result, and I had to agree that the outside editor had spotted the weak spots in the book and made reasonable suggestions for improving it. The suggestions were basically for fine-tuning the book that’s already written, not for a wholesale rewrite.
and so the rights to Acts were sold, and Sundman began working on the book that would become Biodigital. Remixing the raw material in Acts, if you will.

Long story short, the indie publisher was sold to another publisher, and the rights to Acts/Biodigital reverted. So now what to do? How do you go about selling a book that's a remix of another book that's been free? From the buyer's point of view it's very confusing. Which book should be read first? Is Biodigital supposed to replace Acts as the first book in the series? If you've read Acts, do you really want to read Biodigital? From the bookseller's point of view, who's the audience- people who loved Acts?

Unglue.it's "Buy to Unglue" program was a good fit for the book. It uses a dated Creative Commons license on the books it sells. So on April 27, 2016 or sooner, depending on sales, Biodigital gets a CC BY-SA license. That means that Sundman isn't the only one who gets to remix the book. You can rewrite it as Pseudo-BioDigital if you want, and release it yourself under the same license, as long as you credit Sundman. It's a "Free Culture" license (though not in effect yet) that allows the book to never be finished.

Biodigital is a novel of technopotheosis. Google that word, by the way. It's the process of humans merging with technology to become gods. But don't get the wrong impression. Biodigital isn't about technopotheosis.  It's about the reactions of people to the way technology changes us. One reaction is to decide it's fictional. Another is to be scared. And a third is to become a god. Really, we're all choosing, one way or another.

So we're merging real technology with real books to give them new life, giving them immortality. Bibliopotheosis?