Selection of the week

October 10, 2014

Here is the iconic soprano Maria Callas singing Puccini’s aria “O mio babbino caro” from the opera Gianni Schicchi. It was also used in the 1985 film “A Room with a View.”

Nobel Prize in Physiology or Medicine

October 6, 2014

The Nobel Prize in Physiology or Medicine was awarded this morning to John O’Keefe, May-Britt Moser and Edvard Moser for the discovery of place cells and grid cells, respectively. O’Keefe discovered in 1971 that there were cells in the hippocampus that fired when a rat was in a certain location. He called these place cells, and a whole generation of scientists, including the Mosers, have been studying them ever since. In 2005, the Mosers discovered grid cells in the entorhinal cortex, which feeds into the hippocampus. Grid cells fire whenever a rat passes through periodically spaced points in a given area such as a room, dividing the room into a triangular lattice. Different grid cells have different frequencies, phases and orientations.

For humans, the hippocampus is an area of the brain known to be associated with memory formation. Much of what we know about the hippocampus in humans was learned by studying Henry Molaison, known as H.M. in the scientific literature, who had both of his hippocampi removed as a young man because of severe epileptic fits. H.M. could carry on a conversation but could not remember any of it if he was distracted. He had to be re-introduced to the medical staff that treated and observed him every day. H.M. showed us that memory comes in at least three forms. There is very short term or working memory, necessary to carry on a conversation or remember a phone number long enough to dial it. Then there is long term explicit or declarative memory, for which the hippocampus is essential. This is the memory of episodic events in your life and random learned facts about the world. People without a hippocampus, as depicted in the film Memento, cannot form explicit memories. Finally, there is implicit long term memory, such as how to ride a bicycle or use a pencil. This type of memory does not seem to require the hippocampus, as evidenced by the fact that H.M. could become more skilled at certain games that he was taught to play daily even though he professed never to have played them before each time. The implication of the hippocampus in spatial location in humans is more recent. There was the famous study showing that London cab drivers had an enlarged hippocampus compared to controls, and neural imaging has now shown something akin to place fields in humans.

While the three new laureates are all excellent scientists and deserving of the prize, this is yet another example of how the Nobel Prize singles out individuals at the expense of other important contributors. O’Keefe’s coauthor on the 1971 paper, Jonathan Dostrovsky, did not share the award. I’ve also been told that my former colleague at the University of Pittsburgh, Bill Skaggs, was the one who pointed out to the Mosers that the patterns in their data corresponded to grid cells. Bill was one of the most brilliant scientists I have known but did not secure tenure and, as far as I know, is no longer directly involved in academic research. The academic system should find a way to make the most of the skills of people like Bill and Douglas Prasher.

Finally, the hype surrounding the prize announcement is that the research could be important for treating Alzheimer’s disease, which is associated with a loss of episodic memory and navigational ability. However, if we accept the premise that there must be a neural correlate of anything an animal can do, then place cells must necessarily exist given that rats have the ability to discern spatial location. What we did not know was where these cells are, and O’Keefe showed us that they are in the hippocampus, but we could also have associated the hippocampus with the memory loss of Alzheimer’s disease from H.M. The existence of grid cells was less predictable, since it is not inherently obvious that we can naturally divide a room into a triangular lattice. It is plausible that grid cells perform the computation giving rise to place cells, but we still need to understand the computation that gives rise to grid cells. It is not obvious to me that grid cells are easier to compute than place cells.

Selection of the week

October 3, 2014

Here is a short snippet from a BBC show of fifty years ago featuring the great English guitarist Julian Bream playing two preludes by Brazilian composer Heitor Villa-Lobos.

Linear and nonlinear thinking

October 1, 2014

A linear system is one where the whole is precisely the sum of its parts. You can know how different parts will act together by simply knowing how they act in isolation. A nonlinear system lacks this nice property. For example, consider a linear function f(x). It satisfies the property that f(a x + b y) = a f(x) + b f(y). The function of the sum is the sum of the functions. One important point to note is that what is considered to be the paragon of linearity, namely a line on a graph, i.e. f(x) = mx + b, is not linear since f(x + y) = m (x + y) + b \ne f(x) + f(y). The y-intercept b destroys the linearity of the line. A line is instead affine, which is to say a linear function shifted by a constant. A linear differential equation has the form

\frac{dx}{dt} = M x

where x can be in any dimension.  Solutions of a linear differential equation can be multiplied by any constant and added together.
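This superposition property is easy to verify numerically. The sketch below uses a crude Euler integration of dx/dt = Mx with an arbitrary example matrix; evolving a linear combination of initial conditions gives the same result as combining the evolved solutions:

```python
import numpy as np

M = np.array([[0.0, 1.0], [-1.0, -0.1]])   # an arbitrary example matrix

def solve(x0, h=1e-3, steps=2000):
    # crude Euler integration of dx/dt = M x from initial condition x0
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + h * (M @ x)
    return x

x1, x2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
a, b = 2.0, -3.0

# Evolving the combination equals combining the evolutions
lhs = solve(a * x1 + b * x2)
rhs = a * solve(x1) + b * solve(x2)
print(np.allclose(lhs, rhs))   # True
```

Try replacing M @ x with a nonlinear right hand side like np.tanh(M @ x) and the two results will no longer agree.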

Linearity is thus essential for engineering. If you are designing a bridge then you simply add as many struts as you need to support the predicted load. Electronic circuit design is also linear in the sense that you combine as many logic circuits as you need to achieve your end. Imagine if bridge mechanics were completely nonlinear so that you had no way to predict how a bunch of struts would behave when assembled together. You would then have to test each combination to see how they work. Now, real bridges are not entirely linear but the deviations from pure linearity are mild enough that you can make predictions or have rules of thumb of what will work and what will not.

Chemistry is an example of a system that is highly nonlinear. You can’t know how a compound will act just based on the properties of its components. For example, you can’t simply mix glass and steel together to get a strong and hard transparent material. You need to be clever in coming up with something like the Gorilla Glass used in iPhones. This is why engineering new drugs is so hard. Although organic chemistry is quite sophisticated in its ability to synthesize various compounds, there is no systematic way to generate molecules of a given shape or potency. We really don’t know how molecules will behave until we create them. Hence, what is usually done in drug discovery is to screen a large number of molecules against specific targets and hope. I was at a computer-aided drug design Gordon conference a few years ago and you could cut the despair and angst with a knife.

That is not to say that engineering is completely hopeless for nonlinear systems. Most nonlinear systems act linearly if you perturb them gently enough. That is why linear regression is so useful and prevalent. Hence, even though the global climate system is a highly nonlinear system, it probably acts close to linearly for small changes. Thus I feel confident that we can predict the increase in temperature for a 5% or 10% change in the concentration of greenhouse gases but much less confident in what will happen if we double or treble them. How linearly a system acts depends on how close it is to a critical or bifurcation point. If the climate is very far from a bifurcation then it could act linearly over a large range, but if we’re near a bifurcation then who knows what will happen if we cross it.
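A toy illustration of this point, with tanh standing in as an arbitrary saturating input-output response: for small inputs, doubling the input nearly doubles the output, while for large inputs the proportionality breaks down.

```python
import math

def response(x):
    # a toy saturating nonlinearity standing in for a complex system
    return math.tanh(x)

# For small perturbations, doubling the input nearly doubles the output
small = response(0.02) / response(0.01)   # close to 2
# For large perturbations, the response is far from proportional
large = response(2.0) / response(1.0)     # well below 2
print(small, large)
```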

I think biology is an example of a nonlinear system with a wide linear range. Recent research has found that many complex traits and diseases like height and type 2 diabetes depend on a large number of linearly acting genes (see here). Their genetic effects are additive. Any nonlinear interactions they have with other genes (i.e. epistasis) are tiny. That is not to say that there are no nonlinear interactions between genes. It only suggests that common variations are mostly linear. This makes sense from an engineering and evolutionary perspective. It is hard to do either in a highly nonlinear regime. You need some predictability if you make a small change. If changing an allele had completely different effects depending on what other genes were present then natural selection would be hard pressed to act on it.

However, you also can’t have a perfectly linear system because you can’t make complex things. An exclusive OR logic circuit cannot be constructed without a threshold nonlinearity. Hence, biology and engineering must involve “the linear combination of nonlinear gadgets”. A bridge is the linear combination of highly nonlinear steel struts and cables. A computer is the linear combination of nonlinear logic gates. This occurs at all scales as well. In biology, you have nonlinear molecules forming a linear genetic code. Two nonlinear mitochondria may combine mostly linearly in a cell and two liver cells may combine mostly linearly in a liver.  This effective linearity is why organisms can have a wide range of scales. A mouse liver is thousands of times smaller than a human one but their functions are mostly the same. You also don’t need very many nonlinear gadgets to have extreme complexity. The genes between organisms can be mostly conserved while the phenotypes are widely divergent.
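To make the XOR point concrete, here is a sketch of XOR built from three threshold units; the particular weights and thresholds are illustrative choices, but no single linear threshold unit can do this alone.

```python
def step(z):
    # threshold nonlinearity: fires if the input exceeds zero
    return 1 if z > 0 else 0

def xor(a, b):
    # two threshold units feeding a third: (a OR b) AND NOT (a AND b)
    h1 = step(a + b - 0.5)        # OR gate
    h2 = step(a + b - 1.5)        # AND gate
    return step(h1 - h2 - 0.5)    # OR and not AND

print([xor(a, b) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```

The whole circuit is a linear combination of nonlinear gadgets: each unit sums its inputs linearly and then applies the threshold.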

Selection of the week

September 26, 2014

Let’s bring in the fall with Vivaldi’s Autumn from the Four Seasons. Detroit is mostly known for cars and bankruptcy but it also has great culture.  Here is the Detroit Symphony Orchestra with Scottish violinist Nicola Benedetti.

Selection of the week

September 19, 2014

The Chromatic Fantasy and Fugue in D minor, BWV 903 by Johann Sebastian Bach played by Canadian pianist Angela Hewitt, perhaps the best Bach interpreter since the great Glenn Gould.

Julia update

September 16, 2014

I’m totally committed to Julia now.  It is blazingly fast and very similar to Matlab, with some small differences (improvements). I particularly like the fact that when you declare an array with one argument, like x = zeros(10), it immediately gives you a vector of length 10 and not a 10 × 10 matrix. The broadcast function is also very useful.  There are still many things I don’t yet know how to do in Julia, like how to import and export data.  Plotting is also not fully solved. I’ve been using PyCall to import matplotlib.pyplot, which works pretty well but not perfectly. There are things I miss in Matlab, like context-dependent history recall, i.e. recalling an old command line by just typing the first few letters.  If anyone knows how to do this in Julia, please let me know.  Right now, I’m hitting the up-arrow key continuously until I find the line I want. I do worry that my abandoning Matlab means someone will lose their job, which is certainly not my intention. However, I am well ahead of schedule for zero Matlab.

Selection of the week

September 12, 2014

I think the Meditation from Jules Massenet’s opera Thaïs is appropriate for the day after September 11.  Here is violinist Sarah Chang.

Unintended consequences

September 10, 2014

Here is a true story. A young man is trained to hit people as hard as possible and to react immediately to any provocation with unhindered aggression. He signs a 5-year contract for 35 million dollars to do this 16 times a year, or more if he and his colleagues are very successful at doing it. One day he gets upset with his fiancée and strikes her in the head so hard that she is knocked unconscious in a public place. This creates a minor stir, so the employer mandates that he apologize and prohibits him from smashing into people for 2 of the 16 times he is scheduled to do so. The fiancée, now his spouse, also refuses to press charges because she doesn’t want to jeopardize the 27 million owed to the man over the next 3 years. However, a video showing the incident is made public, creating a huge uproar, so the employer abruptly fires and condemns the man, since he is no longer financially useful. The public feels vindicated that such a despicable man is no longer employed and that domestic violence is now given the attention it deserves. However, the spouse is very unhappy because her comfortable lifestyle has just been pulled out from under her. Now, other spouses who live with violent but rich men will be even more silent about abuse because they fear losing their livelihoods too. If we really cared about victims of domestic violence, we would force the employer to set up a fund to ensure that spouses who come forward are compensated financially. We would also force them to support institutions that help the many more victims of domestic abuse who are not married to rich and famous people. This young man is probably an upstanding citizen most of the time. Now he is unemployed and potentially even angrier. He should not be thrown out onto the street but given a chance to redeem himself. The employers and the system that trained and groomed these young men need to look at themselves.

The morality of watching (American) football

September 7, 2014

The question in this week’s New York Times Ethicist column is whether it is wrong to watch football because of the inherent dangers to the players. The ethicist, Chuck Klosterman, says that it is ethical to watch football because the players made the decision to play freely with full knowledge of the risks. Although I think Klosterman has a valid point and I do not judge anyone who enjoys football, I have personally decided to forgo watching it. I simply could no longer stomach watching player after player going down with serious injuries each week. In Klosterman’s article, he goes on to say that even if football were the only livelihood the players had, we should still watch football so that they could have a livelihood. This is where I disagree. Aside from the fact that we shouldn’t have a society where the only chance to have a decent livelihood is through sports, football need not be that sport. If football did not exist, some other sport, including a modified safer football, would take its place. Soccer is the most popular sport in the rest of the world. Football exists in its current form because the fans support it. If that support moved to another sport, the players would move too.

Selection of the week

September 5, 2014

I think we should start September off with Beethoven.  Here is the first movement of the Sextet in E-flat, Op. 81b, for two horns and strings performed by the Amici Ensemble Frankfurt. Note that they added a double bass, just to make it more special.

Selection of the week

August 29, 2014

Chopin’s Nocturne No. 8, Op. 27 No. 2, performed by Maurizio Pollini, whom I had the fortune of seeing in Boston many years ago.

MCMC for linear models

August 25, 2014

I’ve been asked in a comment to give a sample of pseudo code for an MCMC algorithm to fit a linear model ax + b to some data. See here for the original post on MCMC. With a linear model, you can write down the answer in closed form (see here), so it is a good model to test your algorithm and code.  Here it is in pseudo-Julia code:

# initial guesses for parameters a and b (a0 and b0 are your starting values)
a = a0
b = b0
# construct chi squared, where D is the data vector and x is the vector of the
# independent quantity
chi = norm(D - (a*x + b))^2
for n = 1:total
    # Propose new a and b, normally distributed around the old values
    # with standard deviations asig and bsig
    at = a + asig * randn()
    bt = b + bsig * randn()
    chit = norm(D - (at*x + bt))^2
    # Take the ratio of likelihoods; sigma is the data uncertainty
    ratio = exp((chi - chit)/(2*sigma^2))
    # Compare the ratio to a uniform random number between 0 and 1;
    # keep the new parameters if the ratio exceeds the random number
    if rand() < ratio
        a = at
        b = bt
        chi = chit
    end
end
# Keep running until convergence
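For those who want something directly executable, here is a sketch of the same Metropolis scheme in Python with synthetic data; the true parameters, noise level, chain length, and proposal widths are all arbitrary choices for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a known line a_true*x + b_true with Gaussian noise sigma
a_true, b_true, sigma = 2.0, -1.0, 0.5
x = np.linspace(0.0, 10.0, 50)
D = a_true * x + b_true + sigma * rng.normal(size=x.size)

a, b = 0.0, 0.0            # initial guesses for the parameters
asig, bsig = 0.1, 0.1      # proposal standard deviations
chi = np.sum((D - (a * x + b)) ** 2)

samples = []
for n in range(20000):
    # propose new parameters, normally distributed around the old ones
    at = a + asig * rng.normal()
    bt = b + bsig * rng.normal()
    chit = np.sum((D - (at * x + bt)) ** 2)
    # Metropolis rule: always accept downhill moves, otherwise accept
    # with probability equal to the likelihood ratio
    if chit < chi or rng.random() < np.exp((chi - chit) / (2 * sigma ** 2)):
        a, b, chi = at, bt, chit
    samples.append((a, b))

burn = np.array(samples[5000:])   # discard burn-in
print(burn.mean(axis=0))          # posterior means, near (2, -1)
```

The posterior means can be checked against the closed-form least-squares answer mentioned above.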

Selection of the week

August 22, 2014

George Gershwin, although generally classified as a jazz composer, was strongly influenced by classical music.  Here is Summertime from his opera Porgy and Bess performed by soprano Renée Fleming.

J. Bryce McLeod, 1929-2014

August 20, 2014

I was given the sad news that J. Bryce McLeod died today in his home in England. Bryce was an extraordinary mathematician and an even better human being. I had the fortune of being his colleague in the math department at the University of Pittsburgh. I will always remember how gracious and welcoming he was when I started. One of the highlights of my career was being invited to a conference in his honour in Oxford in 2001. At the conference dinner, Bryce gave the most perfectly constructed speech I have ever heard. It was just like the way he did mathematics – elegantly and sublimely.

Julia vs Python update

August 19, 2014

We’ve been using Julia for a little over a month now and I am quite pleased thus far. It is fast enough for us to get decent results for our MCMC runs; runs for which Python and Brian were too slow. Of course, we probably could have tried to optimize our other codes, but Julia installs right out of the box and is very easy to program in. We still have issues with plotting but do have minimal plot functionality by using PyCall to import matplotlib.pyplot. One can also dump the results to a file and have either Python or some other software make the plots. I already find myself reaching for Julia when I want to write a quick code to solve a problem instead of relying on Matlab like I used to. While I would like to try PyDSTool, the lack of a trivial installation is holding us back. For you budding software developers out there: if it takes more than one click on a link to make it work on my computer, I am likely to go to something else. The reason I switched from Linux to Mac a decade ago was that I wanted Unix capability and the ability to print without having to consult with a sys admin.

Selection of the week

August 15, 2014

Eine kleine Nachtmusik by Wolfgang Amadeus Mozart performed by the Concertgebouw Kamerorkest in 2013 in Amsterdam.

Numerical integration

August 14, 2014

Solving differential equations numerically is a very mature field with multiple algorithms carefully explained in multiple textbooks. However, when it comes to actually solving them in practice, a lot of nonspecialists, myself included, will often resort to the Euler method. There are two main reasons for this. The first is that it is trivial to remember and code, and the second is that Moore’s law has increased the speed of computing sufficiently that we can get away with it. However, I think it’s time to get out of the eighteenth century and at least move into the nineteenth.

Suppose you have a differential equation \dot{x}=f(x,t), where x can be a scalar or a vector in an arbitrary number of dimensions.  The Euler method simply discretizes the time derivative via

\frac{dx}{dt} \rightarrow \frac{x(t+h)-x(t)}{h}

so that x at time step t+h is given by

x(t+h) = x(t) + h f(x(t),t)
Now what could be simpler than that? There are two problems with this algorithm. The first is that it is only accurate to order h so you need to have very small steps to simulate accurately. The second is that the Euler method is unstable if the step size is too large. The way around these limitations is to use very small time steps, which is computationally expensive.
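The first-order accuracy is easy to see numerically. The sketch below (the test equation and step sizes are arbitrary choices) integrates dx/dt = -x to t = 1 and shows that halving the step size roughly halves the error:

```python
import math

def euler(f, x0, h, steps):
    # forward Euler: x_{n+1} = x_n + h f(x_n, t_n)
    x, t = x0, 0.0
    for _ in range(steps):
        x = x + h * f(x, t)
        t += h
    return x

f = lambda x, t: -x                            # dx/dt = -x, exact solution e^{-t}
exact = math.exp(-1.0)
err1 = abs(euler(f, 1.0, 0.01, 100) - exact)   # integrate to t = 1 with h = 0.01
err2 = abs(euler(f, 1.0, 0.005, 200) - exact)  # same, with h halved
print(err1 / err2)                             # close to 2: error scales like h
```

For a second-order method the same halving of h would quarter the error.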

The second most popular algorithm is probably (4th order) Runge-Kutta, which can generally use larger time steps for the same accuracy. However, a higher order algorithm is not necessarily faster because what really slows down a numerical scheme is the number of function evaluations it needs to make and 4th order Runge-Kutta needs to make a lot of function evaluations. Runge-Kutta is also plagued by the same stability problems as Euler and that can also limit the maximum allowed step size.

Now, these problems have been known for decades if not centuries and work-arounds do exist. In fact, the most sophisticated solvers such as CVODE tend to use adaptive multi-step methods, like Adams-Bashforth (AB) or its implicit cousin Adams-Moulton, and this is what we should all be implementing in our own codes. Below I will give simple pseudo-code to show that a 2nd order Adams-Bashforth scheme is not much more complicated to code than Euler and uses the same number of function evaluations. In short, with not much effort, you can speed up your code immensely by simply switching to AB.

The cleverness of AB is that it uses previous steps to refine the estimate for the next step. In this way, you get a higher order scheme without having to re-evaluate the right hand side of your ODE. You simply reuse your computations. It’s the ultimate green scheme. For the same ODE above, 2nd order AB is simply

x(t+h) = x(t) + h \left(\frac{3}{2} f(x(t),t) - \frac{1}{2} f(x(t-h),t-h)\right)
You just need to store the previous two evaluations to compute the next step.  To start the algorithm, you can just use an Euler algorithm to get the first step.  

Here is the 2nd order AB algorithm written in pseudo Julia code for a one dimensional version of the ODE above.

# make x a two dimensional vector to hold the two active time steps
x = zeros(2)
fstore = zeros(2)
xout = zeros(Tfinal)
# Set initial condition (x0 is the given initial value)
x[1] = x0
xout[1] = x[1]
# Store evaluation of function f(x,t) for later use
fstore[1] = f(x[1], h)
# Take Euler step
x[2] = x[1] + h*fstore[1]
# store function at time = 2h
fstore[2] = f(x[2], 2*h)
# Store computed values for output
xout[2] = x[2]
# Take AB steps
for t = 3:Tfinal
    # set update indices for x based on parity of t
    index2 = t % 2 + 1
    index1 = (t-1) % 2 + 1
    # 2nd order AB step: fstore[index2] is the most recent evaluation
    # and fstore[index1] the one before it
    x[index1] = x[index2] + h*(1.5*fstore[index2] - 0.5*fstore[index1])
    # update stored function
    fstore[index1] = f(x[index1], t*h)
    # store computed value for output
    xout[t] = x[index1]
end
return xout
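As a sanity check, here is a runnable translation of the same idea in self-contained Python, applied to the test problem dx/dt = -x (the step size and horizon are arbitrary choices for the demo), with Euler alongside for comparison:

```python
import math

def f(x, t):
    return -x                        # test problem: dx/dt = -x

h, T = 0.1, 100                      # step size and number of steps
exact = math.exp(-h * T)             # exact solution at t = h*T

# Euler for comparison
xe = 1.0
for n in range(T):
    xe += h * f(xe, n * h)

# 2nd order Adams-Bashforth, bootstrapped with one Euler step
x_prev, f_prev = 1.0, f(1.0, 0.0)
x = x_prev + h * f_prev              # Euler step to get x at t = h
for n in range(1, T):
    f_curr = f(x, n * h)
    x, x_prev = x + h * (1.5 * f_curr - 0.5 * f_prev), x
    f_prev = f_curr

print(abs(xe - exact), abs(x - exact))  # AB2 error is much smaller
```

Both methods make one function evaluation per step, yet the AB2 result is far closer to the exact answer.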

Selection of the week

August 8, 2014

The 4th movement of Franz Schubert’s Trout Quintet (Piano Quintet in A Major) performed by the Schubert Ensemble.

The neuro software gap

August 4, 2014

Our foray into Julia was mostly motivated by a desire to have a programming language as fast as C but as easy to program as Matlab. In particular, we need really fast looping speed, which is necessary to simulate dynamical systems. Our particular project involves fitting a conductance-based model (i.e. Hodgkin-Huxley-like equations) to some data using MCMC. We’re not trying to fit the precise spike shapes per se but the firing rates of synaptically coupled pools of neurons in response to various inputs. We thus need to run simulations for some time, compare the averaged rates to previous runs, and iterate. For good fits, we may need to do this a million times. Hence, we need speed and we can’t really parallelize because we need information from one iteration before going to the next.

Now, one would think that simulating neural networks is a long solved problem and I thought that too. Hence, I instructed my fellow Wally Xie to look into all the available neural simulators and see what fits the bill. We first tried Neuron but couldn’t get all of it to work on our Linux machine running CentOS (which is what NIH forces us to use).  That was the end of Neuron for us. We next tried Nest but found it difficult to specify the neuron model. We finally settled on Brian, which is adaptable, written in Python, and supported by very nice people, but is turning out to be too slow. Native Python is also too slow. Stan would be great but it does not yet support differential equations. Wally is now implementing the code in Julia, which is turning out to be very fast but has a possible bug that is preventing long runs. The developers are extremely helpful though so we believe this will be worked out soon. However, it is hard to get plotting to work on Julia and as pointed out by many, Julia is not nearly as complete as Python or Matlab. (Yes, Matlab is too slow). We could use C and will go to it next if Julia does not work out. Thus, even though people have been developing neural simulators for decades, there is still an unmet need for something adaptable, fast, easy to use, and runs out of the box. A neuro-Stan would be great.

