Platforms

I got used to things going pretty well, going well enough. It was a matter of standing on a steady stone, moving to one higher or broader when the opportunity presented.

I told myself to appreciate them as platforms, thinking of these situations as being good places to be. I learned at the beginning that not to decide is to decide and not to do is to do.

I’ve made some observations that may have some value, but mostly I’ve read the words of others. Let’s see if I’ve gotten it right.

In Defense of Prediction

It never bothers me that people want to know the future. Anybody would want to know what’s going to happen next week or next year. In the grand scheme of things, it really doesn’t matter whether it’s going to be raining tomorrow morning, but I would like to enjoy a run on a crisp, sunny fall morning. Is that run going to prolong my life? The longevity effects of running are of a bit more import and might change what I do tomorrow even if the weather is fine. Please, if I could know which way Apple’s stock price was headed over the next 5 years, I’d be able to make a good bit of money. I don’t need to know the exact price, mind you. I’d settle for knowing if it will be higher or lower and by about how much.

We just can’t know the future. I argue strongly that we don’t even know the present. We don’t even have a very sound understanding of the past. Our world is too big and too complex to comprehend thoroughly. The present and the past are here or were here and are fixed. The future hasn’t happened yet. Critically, our intuitive notion of free will convinces us that the future is not determined and can be changed by our choices.

Now if you believe in a clockwork universe, where each present state determines the next state mechanically and without uncertainty, stop reading here please. You have no choice: the future is already determined and you’ll gain nothing from understanding how to decide better. I don’t even know why I’d tell anyone living in a clockwork universe to stop reading in the first place. After all, they lack the ability to choose whether or not to continue anyway.

If you believe in free will and a truly uncertain future that can’t be known, then you’d still want a prediction of tomorrow’s weather or next year’s stock prices. I’ll continue to reference the National Weather Service forecast and consider the estimates of Apple’s 5 year sales and earnings growth rates.

It never bothers me that we all want to know the future. We realize that we can’t know. What bothers me is when I hear disappointment that predictions about the future turn out to be wrong. “They predicted an inch of snow and we ended up with six!” or “They predicted inflation and rising gold prices and instead gold fell and the price of corn and cotton skyrocketed!” We want the prediction, act on it and then feel betrayed when it’s wrong.

Why predict the future when we know perfectly well it can’t be known? Even when the prediction is right, maybe it was a lucky guess. After all, even the broken clock tells the right time twice a day.

We can’t know the future but we can predict it. These two very important English words, know and predict, deserve closer examination. The space between them reveals so much of the mystery of how we live in an uncertain universe.

Know and predict are verbs. They are things to do. Since one can only do things in the moment, they can’t be done in the future or the past. Of course our language lets us talk about future and past events as if they are happening now, but saying “I will know” or “I knew” is just a shift in outlook, not an ability to actually do something now that occurs in the future or the past. The flow of time constrains us to living and acting moment to moment.

At present, only people know and predict. In speech, we’ll often ascribe the doing to an object, but it’s a convenience. The thermometer knows the temperature. The crystal ball predicts the winner of the World Series. The thermometer and the crystal ball can’t really know or predict. A person knows the temperature when he or she reads the indication and a person uses the crystal ball to predict the future.

To know implies certainty. There’s only one temperature to be read off of the thermometer. If there is any doubt as to the true state of the world, then believe becomes the appropriate verb. One can argue whether knowledge can be certain, but it seems quite clear to me that if the belief is strong enough, we act as if it is the truth. To know the truth is to admit no doubt.

To predict is to make statements about the future. Is a prediction somehow a claim of knowledge of the future? What does it mean to predict the winner of the World Series or the winner of the next Presidential election?

In anything but a trivial case, statements about the future have to admit at least some doubt. Most of the time, they ought to come along with a truckload of disclaimers. And since the future is not determined by present conditions, free will can overturn any prediction that allows for human action.

Those who claim to know can be proved right or wrong. A prediction can’t ever be right or wrong, being a statement of belief, of future probability. Can I predict the result of your next toss of a coin? If I predict heads, I’ll be wrong half the time. Yet I’ll confidently predict that over a large number of coin flips, the result will be close to 50-50. The Law of Large Numbers lets me refine the accuracy of my prediction based on the number of coin flips. But the next result is not knowable by man or machine.
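The coin-flip argument is easy to see in a short simulation. Here’s a minimal sketch in Python (the function name and the fixed seed are my own, added for repeatability): no single flip is predictable, but the fraction of heads settles toward one half as the number of flips grows.

```python
import random

def flip_fraction(n, seed=0):
    """Simulate n fair coin flips and return the fraction that came up heads."""
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    heads = sum(rng.random() < 0.5 for _ in range(n))
    return heads / n

# The next flip is unknowable, but the long-run average is predictable:
# the fraction of heads drifts toward 0.5 as n grows.
for n in (10, 1000, 100000):
    print(n, flip_fraction(n))
```

Running it shows the point: the 10-flip fraction can land far from 0.5, while the 100,000-flip fraction sits within a fraction of a percent of it.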

Does anyone really want to hear pundits say whom they believe the winner of an election will be? That the mathematical model reveals that 4 out of 5 times tomorrow morning will be sunny and dry? Since those future events will occur only one time and we want to know the future, we demand predictions that have to be wrong in some or even most cases, so that we can pretend to know the future.

Experiencing the Brain at the Society for Neuroscience

It was great to spend a few days this week at the Society for Neuroscience Annual Meeting in Washington, DC. I was invited to a reunion of the Coyle Lab, folks who worked with Joe Coyle over the years. It was great to revisit scientific interests at the meeting and then renew friendships with great people who I see very rarely. After all, I live in the world of drug development and clinical trials which intersects with neuroscience only around areas of disease targets or drug mechanisms.

One of the fun aspects of the Coyle Lab reunion was acting as the night’s photographer. I stay away from event photography in general, having had some bad experiences both in photographing and being photographed. Thanks to the automation of the Nikon Creative Lighting System, I was able to use off-camera flash to more or less consistently light a cavernous atrium room. I’m not sure I would volunteer again soon without some practice and study of technique. Two lessons: first, just like any other type of image capture, event photos need a subject, generally something happening of interest. Second, making something happen, like suggesting poses and groupings, facilitates interesting interactions that can be the subject of photographs.

As for the meeting, the broad sweep of modern neuroscience is breathtaking. Elegant, in-depth work goes on studying every level of organization from molecular, to cellular, to structural, to cognitive. As always, gaining the larger synthetic view is our ongoing challenge. A system is isolated, a factor is manipulated, a change is reported. Evidence is collected, resulting in masses of data no single scientist could ever assimilate. As a bystander, I feel the significance each scientist sees in their own set of experiments, contributing to a larger effort, but I can’t see the larger picture. At symposium talks where more senior scientists present their stories, wider context and significance are assumed or dismissed with a sentence or two.

I can’t help but think that we are doing something fundamentally wrong in building this scientific edifice. Findings get reported as news but are quickly forgotten, changing very little. While the money invested in biomedical science has grown exponentially, the yield of new drugs has been completely flat for 40 years now. As much as we’re able to watch cognition occurring in the brain, the way most of us experience our own thoughts, the actions of our own brains, is still mostly a Cartesian Dualism. We believe we are rational, abstract souls inhabiting bodies rather than understanding how mind is our subjective experience of our brain’s actions.

Duomo At Night



Duomo At Night, originally uploaded by jjvornov.

It’s been a long time since I simply converted an image from RAW and uploaded it to Flickr and ODB. This image, taken in Florence, Italy a couple of nights ago, is about the light that was really there, not my interpretation of it. At some point, I’ll see what adding some Oz to the image does, but it seemed so perfect right out of the camera.

Was This Steve Jobs’ Worst Decision?

Is it possible that Steve Jobs made a poor choice that ultimately cost him his life?

I had no idea that Jobs delayed his potentially curative surgery after initial diagnosis to try to cure his cancer “with a special diet”.

Talking Business – Apple’s Culture of Secrecy – NYTimes.com: “It was an uplifting tale, and an inspiring message. It was also less than the whole truth. In fact, Mr. Jobs first discovered he had an islet cell neuroendocrine tumor — which is both rarer and less deadly than other forms of pancreatic cancer — in October 2003. This was a full nine months before he had the surgery to remove it. Why did he wait so long? Because, according to a Fortune magazine article published in May, Mr. Jobs was hoping to beat the cancer with a special diet.”

My surgeon friends love to say “To cut is to cure”. It’s very true in the case of curing cancer by surgical removal. I have a friend who beat non-small cell lung cancer because she saw a trailer offering screening chest X-Rays. As a smoker, it seemed like a good idea. Within days her cancer was operated on for a cure.

I criticize our biomedical and healthcare industry plenty. Profit seeking and defensiveness cost us money. But I worry about how alternatives are positioned when they delay timely and potentially life-saving treatment. It’s interesting that an intellect as vigorous as Steve Jobs’ can apparently make a choice like that.

I’ll be interested to see whether the authorized bio coming out deals with the delay.

Why did Netflix spin off its DVD business?

When I heard about Netflix separating its DVD business out, I wondered why they were doing this to customers like me. It’s more expensive and less convenient.

Here’s the deal. Netflix killed Blockbuster and the brick and mortar DVD rental business. I was glad to be rid of the late fees and the errands. One by one, the Blockbusters and Hollywood stores closed.

At this point Netflix has to be one of the biggest buyers of back catalogue DVDs. Sure, there’s a section of DVDs in the bookstores and Walmart, but it’s a collection of low-cost evergreen movies and collectors’ items. Eventually DVDs will disappear more or less the way CDs have.

By owning DVD rental and then killing it, Netflix gains leverage over the studios in pushing the entire catalog to streaming, as has already happened in music.

Netflix knows that disc storage will be as niche as CDs in the future and wants to own the next distribution system. Netflix wants to force the future in the way that Apple forced the future in music with iTunes.

My First Earthquake

I was sitting at the kitchen table having just finished lunch. A sliced banana with some Greek yogurt spooned over it. I felt our big orange cat pushing against my chair. Odd that when I looked down, I couldn’t see him. He kept pushing against the chair even though he wasn’t there.

A few seconds later, the washing machine was unbalanced. Except for some reason it wasn’t making any sound, just shaking the whole darn house. By this time everyone was trying to figure out how the washing machine could shake the whole house even when it wasn’t even running. Must be something else! People on the roof? A tree falling and pushing against the house? Was it about to topple over?

An earthquake? Yeah right, an earthquake in Baltimore. Well at least it made sense even though it was almost impossible.

Flickr and Google+

Street Scene

I have a photo workflow that’s been pretty stable for the past few years. Images are ingested from cards into Aperture, which serves as the image database. Image processing is mostly done in Photoshop using NIK filters. On saving, the PSD image is saved in Aperture and a copy exported to Flickr with the FlickrExport plugin.

Flickr is great as a place to have images seen and to look at other images. I don’t spend as much time on the social aspects of Flickr as I have in the past, mostly because I’m not as engaged in photography as I am in other activities these days. Even without the social aspect of Flickr’s Contacts and Groups, it’s still the image repository of choice for public display of images, supporting EXIF browsing and searching.

I also love how easy it is to post images to ODB, either directly from Flickr or through the media manager in MarsEdit, where this post was created. In this workflow, each tool can actually take care of multiple steps, but I tend to take the approach of passing the image and text from app to app based on where I prefer to perform each step. I could, for example, ingest and adjust in Aperture, but I like the layer control of Photoshop. Once the image is in Flickr, I can write and post to the blog, but I like the text handling and preview available with MarsEdit and an external editor.

I’m enjoying the early days with Google+. Uploading iPhone images works well with location tagging and the ability to comment, so that’s a great venue for casual photography with the phone, my connected camera where images are spared my tortuous post-processing. I’ve uploaded a few of my processed images directly to Google+, realizing now that they’re in Picasa albums at Google. It seems unnecessary to have duplicates there, so it may be best to link to Flickr or this weblog in the case of longer posts like this.