I’m currently in Banff, Alberta for a Festschrift for Jack Cowan (webpage here). Jack is one of the founders of theoretical neuroscience and has infused many important ideas into the field. The Wilson-Cowan equations that he and Hugh Wilson developed in the early seventies form a foundation for both modeling neural systems and machine learning. My talk will summarize my work on deriving “generalized Wilson-Cowan equations” that include both neural activity and correlations. The slides can be found here. References and a summary of the work can be found here. All videos of the talks can be found here.
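For readers unfamiliar with them, the Wilson-Cowan equations in their standard firing-rate form (notation varies across papers; this is one common convention, with E and I the excitatory and inhibitory population activities, S a sigmoidal gain function, c's the coupling strengths, r a refractory factor, and P, Q external inputs) are:

```latex
\tau_E \frac{dE}{dt} = -E + (1 - r_E E)\, S_E\!\left(c_{EE} E - c_{EI} I + P\right)
```
```latex
\tau_I \frac{dI}{dt} = -I + (1 - r_I I)\, S_I\!\left(c_{IE} E - c_{II} I + Q\right)
```

The generalized equations in the talk extend this mean-activity description to include correlations; see the linked references for the precise form.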
Addendum: 17:44. Some typos in the talk were fixed.
Addendum: 18:25. I just realized I said something silly in my talk. The Legendre transform is an involution because it is its own inverse: applying the transform twice returns the original function. I said something completely inane instead.
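Concretely (a standard statement, not specific to the talk): for a function $f$, the Legendre transform is

```latex
f^{*}(p) = \sup_{x}\left[\, p x - f(x) \,\right],
```

and for convex, lower semicontinuous $f$ the Fenchel-Moreau theorem gives the involution property

```latex
f^{**} = f .
```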