MCMC for linear models

August 25, 2014

I’ve been asked in a comment to give a sample of pseudocode for an MCMC algorithm that fits a linear model ax + b to some data. See here for the original post on MCMC. With a linear model, you can write down the answer in closed form (see here), so it is a good model for testing your algorithm and code. Here it is in pseudo-Julia code:

# initial guess for parameters a and b
a = 0.0
b = 0.0
# construct chi squared, where D is the data vector and x is the vector of the
# independent quantity
chi = norm(D - (a*x + b))^2
for n = 1:total
    # Make random guesses for new normally distributed a and b with mean old a and b
    # and standard deviation asig and bsig
    at = a + asig*randn()
    bt = b + bsig*randn()
    chit = norm(D - (at*x + bt))^2
    # Take ratio of likelihoods, sigma is the data uncertainty
    ratio = exp((-chit + chi)/(2*sigma^2))
    # Compare the ratio to a uniform random number between 0 and 1,
    # keep new parameters if ratio exceeds random number
    if rand() < ratio
        a = at
        b = bt
        chi = chit
    end
end

# Keep running until convergence
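To actually use the chain, you also want to store the accepted parameters at each step so you can compute posterior means afterwards. Here is a minimal self-contained sketch along those lines; the function name, synthetic data, proposal widths, and chain length are all my own choices for illustration:

using LinearAlgebra, Statistics

# Metropolis sampler for the line a*x + b, storing the chain
function mcmc_line(D, x, sigma; total=10^5, asig=0.05, bsig=0.05)
    a, b = 0.0, 0.0
    chi = norm(D - (a*x .+ b))^2
    achain = zeros(total)
    bchain = zeros(total)
    for n = 1:total
        at = a + asig*randn()
        bt = b + bsig*randn()
        chit = norm(D - (at*x .+ bt))^2
        # same acceptance rule as above
        if rand() < exp((chi - chit)/(2*sigma^2))
            a, b, chi = at, bt, chit
        end
        achain[n] = a
        bchain[n] = b
    end
    return achain, bchain
end

# synthetic data from a line with slope 2 and intercept 1
x = collect(0.0:0.1:10.0)
sigma = 0.5
D = 2.0*x .+ 1.0 .+ sigma*randn(length(x))
achain, bchain = mcmc_line(D, x, sigma)
# crude posterior means after discarding the first half as burn-in
burn = length(achain) ÷ 2
println(mean(achain[burn:end]), "  ", mean(bchain[burn:end]))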

Selection of the week

August 22, 2014

George Gershwin, although generally classified as a jazz composer, was strongly influenced by classical music. Here is Summertime from his opera Porgy and Bess, performed by soprano Renée Fleming.

J. Bryce McLeod, 1929-2014

August 20, 2014

I was given the sad news that J. Bryce McLeod died today in his home in England. Bryce was an extraordinary mathematician and an even better human being. I had the fortune of being his colleague in the math department at the University of Pittsburgh. I will always remember how gracious and welcoming he was when I started. One of the highlights of my career was being invited to a conference in his honour in Oxford in 2001. At the conference dinner, Bryce gave the most perfectly constructed speech I have ever heard. It was just like the way he did mathematics – elegantly and sublimely.

Julia vs Python update

August 19, 2014

We’ve been using Julia for a little over a month now and I am quite pleased thus far. It is fast enough for us to get decent results for our MCMC runs; runs for which Python and Brian were too slow. Of course, we probably could have tried to optimize our other codes, but Julia installs right out of the box and is very easy to program in. We still have issues with plotting, but we do have minimal plot functionality by using PyCall to import matplotlib.pyplot. One can also dump the results to a file and have either Python or some other software make the plots. I already find myself reaching for Julia when I want to write a quick code to solve a problem, instead of relying on Matlab like I used to. While I would like to try PyDSTool, the lack of a trivial installation is holding us back. For you budding software developers out there: if it takes more than one click on a link to make it work on my computer, I am likely to go to something else. The reason I switched from Linux to Mac a decade ago was that I wanted Unix capability and the ability to print without having to consult with a sys admin.
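For the record, the plotting workaround looks roughly like this; a minimal sketch assuming PyCall and a working matplotlib installation:

using PyCall
@pyimport matplotlib.pyplot as plt
# plot a sine curve as a smoke test
x = collect(0:0.01:2*pi)
y = [sin(t) for t in x]
plt.plot(x, y)
plt.show()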

Selection of the week

August 15, 2014

Eine kleine Nachtmusik by Wolfgang Amadeus Mozart, performed by the Concertgebouw Kamerorkest in Amsterdam in 2013.

Numerical integration

August 14, 2014

Solving differential equations numerically is a very mature field, with multiple algorithms carefully explained in multiple textbooks. However, when it comes to actually solving them in practice, a lot of nonspecialists, myself included, will often resort to the Euler method. There are two main reasons for this. The first is that it is trivial to remember and code, and the second is that Moore’s law has increased the speed of computing sufficiently that we can get away with it. However, I think it’s time to get out of the eighteenth century and at least move into the nineteenth.

Suppose you have a differential equation \dot{x}=f(x,t), where x can be a scalar or a vector in an arbitrary number of dimensions.  The Euler method simply discretizes the time derivative via

\frac{dx}{dt} \rightarrow \frac{x(t+h)-x(t)}{h}

so that x at time step t+h is given by

x(t+h)=x(t)+h*f(x(t),t)

Now what could be simpler than that? There are two problems with this algorithm. The first is that it is only accurate to order h so you need to have very small steps to simulate accurately. The second is that the Euler method is unstable if the step size is too large. The way around these limitations is to use very small time steps, which is computationally expensive.
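For concreteness, here is the whole Euler method as a Julia function (a sketch; the function name and arguments are my own):

# forward Euler for dx/dt = f(x,t): one evaluation of f per step
function euler(f, x0, h, nsteps)
    x = x0
    for n = 1:nsteps
        x = x + h*f(x, (n-1)*h)
    end
    return x
end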

The second most popular algorithm is probably (4th order) Runge-Kutta, which can generally use larger time steps for the same accuracy. However, a higher order algorithm is not necessarily faster, because what really slows down a numerical scheme is the number of function evaluations it needs to make, and 4th order Runge-Kutta makes four per time step. Runge-Kutta is also plagued by the same stability problems as Euler, which can also limit the maximum allowed step size.
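For comparison, a single classical 4th order Runge-Kutta step in Julia looks like this (a sketch; the function name is mine), which makes the four function evaluations per step explicit:

# one classical 4th order Runge-Kutta step for dx/dt = f(x,t)
function rk4_step(f, x, t, h)
    k1 = f(x, t)
    k2 = f(x + h/2*k1, t + h/2)
    k3 = f(x + h/2*k2, t + h/2)
    k4 = f(x + h*k3, t + h)
    return x + h/6*(k1 + 2*k2 + 2*k3 + k4)
end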

Now, these problems have been known for decades if not centuries, and work-arounds do exist. In fact, the most sophisticated solvers, such as CVODE, tend to use adaptive multi-step methods like Adams-Bashforth (AB) or its implicit cousin Adams-Moulton, and this is what we should all be implementing in our own codes. Below I will give simple pseudo-code to show that a 2nd order Adams-Bashforth scheme is not much more complicated to code than Euler and uses the same number of function evaluations. In short, with not much effort, you can speed up your code immensely by simply switching to AB.

The cleverness of AB is that it uses previous steps to refine the estimate for the next step. In this way, you get a higher order scheme without having to re-evaluate the right hand side of your ODE. You simply reuse your computations. It’s the ultimate green scheme. For the same ODE example above, 2nd order AB is simply

x(t+2h)=x(t+h)+h\left(\frac{3}{2}f(x(t+h),t+h)-\frac{1}{2}f(x(t),t)\right)

You just need to store the previous two evaluations to compute the next step. To start the algorithm, you can use a single Euler step to generate the second point.

Here is the 2nd order AB algorithm written in pseudo-Julia code for a one-dimensional version of the ODE above.

# 2nd order AB for dx/dt = f(x,t) in one dimension;
# x is a two element vector holding the two most recent time steps
function ab2(f, x0, h, Tfinal)
    x = zeros(2)
    fstore = zeros(2)
    xout = zeros(Tfinal)
    # set initial condition
    x[1] = x0
    # store evaluation of the function f(x,t) for later use
    fstore[1] = f(x[1], h)
    # take an Euler step
    x[2] = x[1] + h*fstore[1]
    # store function at time = 2h
    fstore[2] = f(x[2], 2*h)
    # store computed values for output
    xout[1] = x[1]
    xout[2] = x[2]
    # take AB steps
    for t = 3:Tfinal
        # set update indices for x based on the parity of t
        index2 = t % 2 + 1        # slot holding the current step
        index1 = (t - 1) % 2 + 1  # slot holding the previous step, overwritten below
        # 2nd order AB step
        x[index1] = x[index2] + h*(1.5*fstore[index2] - 0.5*fstore[index1])
        # update stored function
        fstore[index1] = f(x[index1], t*h)
        # store computed value for output
        xout[t] = x[index1]
    end
    return xout
end
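For example, with f(x,t) = -x, initial condition 1 passed as the second argument, and step size 0.01 (made-up numbers for illustration), a call would look like:

xout = ab2((x, t) -> -x, 1.0, 0.01, 1000)
# with x[1] taken at time h, xout[end] should be close to exp(-9.99), about 4.6e-5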

Selection of the Week

August 8, 2014

The 4th movement of Franz Schubert’s Trout Quintet (Piano Quintet in A Major) performed by the Schubert Ensemble.

The neuro software gap

August 4, 2014

Our foray into Julia was mostly motivated by a desire to have a programming language as fast as C but as easy to program as Matlab. In particular, we need really fast looping speed, which is necessary to simulate dynamical systems. Our particular project involves fitting a conductance-based model (i.e. Hodgkin-Huxley-like equations) to some data using MCMC. We’re not trying to fit the precise spike shapes per se but the firing rates of synaptically coupled pools of neurons in response to various inputs. We thus need to run simulations for some time, compare the averaged rates to previous runs, and iterate. For good fits, we may need to do this a million times. Hence, we need speed and we can’t really parallelize because we need information from one iteration before going to the next.

Now, one would think that simulating neural networks is a long-solved problem, and I thought that too. Hence, I instructed my fellow Wally Xie to look into all the available neural simulators and see what fits the bill. We first tried Neuron but couldn’t get all of it to work on our Linux machine running CentOS (which is what NIH forces us to use). That was the end of Neuron for us. We next tried Nest but found it difficult to specify the neuron model. We finally settled on Brian, which is adaptable, written in Python, and supported by very nice people, but is turning out to be too slow. Native Python is also too slow. Stan would be great but it does not yet support differential equations. Wally is now implementing the code in Julia, which is turning out to be very fast but has a possible bug that is preventing long runs. The developers are extremely helpful, though, so we believe this will be worked out soon. However, it is hard to get plotting to work in Julia and, as pointed out by many, Julia is not nearly as complete as Python or Matlab. (Yes, Matlab is too slow.) We could use C and will go to it next if Julia does not work out. Thus, even though people have been developing neural simulators for decades, there is still an unmet need for something that is adaptable, fast, easy to use, and runs out of the box. A neuro-Stan would be great.

Selection of the Week

August 1, 2014

Summer from Antonio Vivaldi’s Four Seasons of course!  Here is a performance by the 2nd place winner at the 2012 Menuhin Junior violin competition.

 

JSMB/SMB talk

July 29, 2014

I just gave an invited plenary talk at the joint annual meeting of the Japanese Society for Mathematical Biology and the Society for Mathematical Biology in Osaka, Japan. I talked about my work on steroid-regulated gene transcription. The slides are here. Previous posts on the topic, including background summaries, can be found here.

Tim’s Vermeer

July 28, 2014

Jan Vermeer has been one of my favourite painters ever since I saw his famous “Girl with a Pearl Earring”, which was on display in Toronto in the 1980s. I’ve been on a quest to see all of his paintings, although it’s been on hiatus for the past ten years. Here is the list of what I’ve seen so far (I have five left). You only need to stand in front of a Vermeer for a few seconds to be mesmerized. I stood in front of “The Music Lesson” in Buckingham Palace for at least an hour. The guard started joking with me because I was so transfixed. This is why I’ve been intrigued by recent suggestions by artist David Hockney and others that some great old masters like Vermeer and van Eyck may have used optical aids like the camera obscura. Well, inventor Tim Jenison has taken this theory to another level by attempting to completely recreate Vermeer’s Music Lesson using a setup of mirrors and lenses that he (re)invented. The endeavor is documented in the film Tim’s Vermeer, directed by Teller of Penn and Teller fame. Whether you believe the theory or not (I actually do, and it doesn’t detract at all from my love of Vermeer), what this film does so well is show what dedication, thought, patience, and careful execution can accomplish. I got tired just watching him paint the threads in a Persian rug using his optical tool.

Selection of the week

July 23, 2014

What is called classical music mostly refers to the Western symphony orchestra tradition that starts in the seventeenth century with Vivaldi and peaks in the early twentieth with Mahler. While classical music remains popular, my unscientific sampling of concert hall audiences indicates that the demographic skews to retirement age and above. I don’t know if this means that a generation of music lovers is about to depart or that people only have the patience to sit through a long concert when they are older. In an attempt to introduce a new generation to classical music, I thought I would present a selection each week. And what better way to kick it off than with the pseudo-Baroque precursor to heavy metal, the Praeludium and Allegro by Fritz Kreisler. Kreisler performed in the first half of the twentieth century. He was one of the greatest violin virtuosos of all time and also wrote some great violin ditties. Here is a performance by the then 13-year-old Canadian/American violinist Leila Josefowicz in 1991.

Talk at Jackfest

July 16, 2014

I’m currently in Banff, Alberta for a Festschrift for Jack Cowan (webpage here). Jack is one of the founders of theoretical neuroscience and has infused many important ideas into the field. The Wilson-Cowan equations that he and Hugh Wilson developed in the early seventies form a foundation for both modeling neural systems and machine learning. My talk will summarize my work on deriving “generalized Wilson-Cowan equations” that include both neural activity and correlations. The slides can be found here. References and a summary of the work can be found here. All videos of the talks can be found here.

 

Addendum: 17:44. Some typos in the talk were fixed.

Addendum: 18:25. I just realized I said something silly in my talk. The Legendre transform is an involution because applying the transform twice returns the original function; that is, the transform is its own inverse. I said something completely inane instead.

Zero Matlab 2015

July 7, 2014

So I have bitten the bullet and am committed to phasing out Matlab completely by 2015.  I have Julia installed and can sort of run code in it although I have no idea how to change directories.  cd() doesn’t seem to work for me.  I have also tried to install matplotlib on my MacBook Pro running OS 10.8 using a SciPy superpack but it does not seem to work yet.  When I try to plot something, nothing happens but I have some bouncing rocket ships in my dock.  Feel free to let me know what I’m doing wrong.

 

Addendum: I installed IPython using pip and I can now plot from within IPython.

The need for speed

July 1, 2014

Thanks for all the comments about the attributes of Python and Julia. It seems to me that the most prudent choice is to learn Python and Julia. However, what I would really like to know is just how fast these languages really are and here is the test. What I want to do is to fit networks of coupled ODEs (and  PDEs) to data using MCMC (see here). This means I need a language that loops fast. An example in pseudo-Matlab code would be

for n = 1:N
    for i = 1:T
        y(i+1) = M\y(i)
    end
    % compare to data and set new parameters
end

where h is a parameter and M is some matrix (say 1000 dimensional), which is sometimes a Toeplitz matrix but not always. Hence, in each time step I need to invert a matrix, which can depend on time so I can’t always precompute it, and do a matrix multiplication. Then in each parameter-setting step I need to sum an objective function, like the mean square error, over all the data points. The code to do this in C or Fortran can be pretty complicated because you have to keep track of all the indices and call linear algebra libraries. I thus want something that has the simple syntax of Matlab but is as fast as C. Python seems to be too slow for our needs, but maybe we haven’t optimized the code. Julia seems like the perfect fit, but let me know if I am just deluded.
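For comparison, here is roughly what the same inner loop looks like in Julia; a sketch with a made-up matrix, since the real M would be rebuilt whenever it changes with time or parameters:

d = 1000                  # dimension of the system
T = 100                   # number of time steps
M = randn(d, d)           # placeholder matrix
y = zeros(d, T + 1)
y[:, 1] = ones(d)
for i = 1:T
    y[:, i + 1] = M \ y[:, i]   # backslash solve, same syntax as Matlab
end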

Julia vs Python

June 29, 2014

I was about to start my trek up Python mountain until Bard Ermentrout tipped me to the Julia language and I saw this speed table from here (lower is faster):

Fortran Julia Python R Matlab Octave Mathematica JavaScript Go
gcc 4.8.1 0.2 2.7.3 3.0.2 R2012a 3.6.4 8.0 V8 3.7.12.22 go1
fib 0.26 0.91 30.37 411.36 1992.00 3211.81 64.46 2.18 1.03
parse_int 5.03 1.60 13.95 59.40 1463.16 7109.85 29.54 2.43 4.79
quicksort 1.11 1.14 31.98 524.29 101.84 1132.04 35.74 3.51 1.25
mandel 0.86 0.85 14.19 106.97 64.58 316.95 6.07 3.49 2.36
pi_sum 0.80 1.00 16.33 15.42 1.29 237.41 1.32 0.84 1.41
rand_mat_stat 0.64 1.66 13.52 10.84 6.61 14.98 4.52 3.28 8.12
rand_mat_mul 0.96 1.01 3.41 3.98 1.10 3.41 1.16 14.60 8.51

Julia is a dynamic high level language like MATLAB and Python that is open source and developed at MIT. The syntax looks fairly simple and it is about as fast as C (Fortran looks like it still is the Ferrari of scientific computing). Matlab is fast for vector and matrix operations but deadly slow for loops. I had no idea that Mathematica was so fast. Although Julia is still relatively new and not nearly as expansive as Python, should I drop Python for Julia?

The MATLAB handcuff

June 26, 2014

The first computer language I learned was BASIC back in the stone age, which led directly to Fortran. These are procedural languages that allow the infamous GOTO statement, now shunned by the computer literati. Programming with the GOTO gives you an appreciation for why the Halting problem is undecidable.  Much of what I did in those days was to track down infinite loops. I was introduced to structured programming in university, where I learned Pascal. I didn’t really know what structured programming meant except that I no longer could use GOTO and there were data structures like records. I was forced to use APL at a summer job. I have little recollection of the language except that it was extremely terse and symbolic. It was fun to try to construct the shortest program possible to do the task. The ultimate program was the so-called “APL one liner”. APL gave me first hand experience of the noncomputability of Kolmogorov complexity. In graduate school I went back to Fortran, which was the default language to do scientific computing at that time. I also used the computer algebra system called Macsyma, which was much better than Mathematica. I used it to do Taylor expansions and perturbation theory. I was introduced to C and C++ in my first postdoc. That was an eye-opening experience as I never really understood how a computer worked until I programmed in C. Pointer arithmetic was a revelation. I now had such control and power. C++ was the opposite of C for me. Object oriented programming takes you very far away from the workings of a computer. I basically programmed exclusively in C for a decade – just C and XPP, which was a real game changer. I had no need for anything else until I got to NIH. It was only then that I finally sat down and programmed in MATLAB. I had resisted up to that point and still feel like it is cheating but I now almost do all of my programming in MATLAB, with a smattering of R and XPP of course. I’m also biased against MATLAB because it gave a wrong answer in a previous version. At first, I programmed in MATLAB as I would in C or Fortran but when it came down to writing the codes to estimate heritability directly from GWAS (see here), the matrix manipulating capabilities of MATLAB really became useful. I also learned that statistics is basically applied linear algebra. Now, when I code I think instinctively in matrix terms and it is very hard for me to go back to programming in C. (Although I did learn Objective C recently to write an iPhone App to predict body weight. But that was mostly point-and-click and programming by trial and error. The App does work though (download it here). I did that because I wanted to get a sense of what real programmers actually do.) My goal is to switch from MATLAB to Python and not rely on proprietary software. I encourage my fellows to use Python instead of MATLAB because it will be a cinch to learn MATLAB later if they already know Python. The really big barrier for me for all languages is to learn the ancillary stuff like what do you actually type to run programs, how does Python know where programs are, how do you read in data, how do you plot graphs, etc? In MATLAB, I just click on an icon and everything is there. I keep saying that I will uncuff myself from MATLAB one day and maybe this is the year that I actually do.

New Papers

June 16, 2014
Two new papers are now in print:
The first, on applying compressed sensing to genomics, is now published in GigaScience. The summary of the paper is here and the link is here.
The second is on steroid-regulated gene induction and can be obtained here.
Biochemistry. 2014 Mar 25;53(11):1753-67. doi: 10.1021/bi5000178. Epub 2014 Mar 11.

A kinase-independent activity of Cdk9 modulates glucocorticoid receptor-mediated gene induction.

Abstract

A gene induction competition assay has recently uncovered new inhibitory activities of two transcriptional cofactors, NELF-A and NELF-B, in glucocorticoid-regulated transactivation. NELF-A and -B are also components of the NELF complex, which participates in RNA polymerase II pausing shortly after the initiation of gene transcription. We therefore asked if cofactors (Cdk9 and ELL) best known to affect paused polymerase could reverse the effects of NELF-A and -B. Unexpectedly, Cdk9 and ELL augmented, rather than prevented, the effects of NELF-A and -B. Furthermore, Cdk9 actions are not blocked either by Cdk9 inhibitors (DRB or flavopiridol) or by two Cdk9 mutants defective in kinase activity. The mode and site of action of NELF-A and -B mutants with an altered NELF domain are similarly affected by wild-type and kinase-dead Cdk9. We conclude that Cdk9 is a new modulator of GR action, that Cdk9 and ELL have novel activities in GR-regulated gene expression, that NELF-A and -B can act separately from the NELF complex, and that Cdk9 possesses activities that are independent of Cdk9 kinase activity. Finally, the competition assay has succeeded in ordering the site of action of several cofactors of GR transactivation. Extension of this methodology should be helpful in determining the site and mode of action of numerous additional cofactors and in reducing unwanted side effects.

PMID: 24559102 [PubMed - indexed for MEDLINE]
PMCID: PMC3985961 [Available on 2015/2/21]

Slides for talk on gene expression

June 10, 2014

Here are the slides for my talk today at the NIH Systems Biology Forum on gene expression.  Background for the talk can be found here.

Marc Andreessen on EconTalk

June 3, 2014

If you have any interest in technology and the internet, then you should definitely listen to this EconTalk podcast with Marc Andreessen, who co-wrote Mosaic, the web browser that led to the explosive growth of the internet. He has plenty of insightful things to say. I remember first seeing Mosaic in 1994 as a postdoc in Boulder, Colorado. There I was, doing research that involved programming in C and C++. I was not really happy with what I was doing. I was having a hard time finding the next job. I was one of the first to play around with HTML, and it never occurred to me once that I could pack my bags, move to Silicon Valley, and try to get involved in the burgeoning tech revolution. It just makes me wonder what other obvious things I’m missing right now.

Addendum, 2014-6-5:  Actually, it may have been 1993 that I first saw Mosaic.

