UVA Probability Seminar
Wednesdays, 3:30 - 4:30pm
Kerchof 326
Organizers: Christian Gromoll & Tai Melcher

Spring 2012

25 Jan Organizational meeting
1 Feb Christian Gromoll
A fascinating open problem...
8 Feb Larry Thomas
The polaron and Clark-Ocone formula
15 Feb Patrick Driscoll
Density formulas for stochastic areas
22 Feb Chip Levy
A stochastic model of adaptive-synaptogenesis: Will it construct energy-efficient neural-like computation?
29 Feb Dan Dobbs
What is ... an abstract Wiener space?
7 Mar no seminar (Spring Break)
14 Mar Shankar Bhamidi (UNC)
Limited choice and randomness in evolution of networks
21 Mar Janna Lierl (Cornell)
Estimates for the heat kernel with Dirichlet boundary condition
28 Mar Dan Dobbs
Smoothness results on infinite-dimensional Heisenberg-like groups
4 Apr Linan Chen (McGill)
From additive functions to Wiener maps
11 Apr Malek Abdesselam
Generalized stochastic processes with a quantum field theory flavor
18 Apr Ed Waymire (Oregon State)
Tree polymers under weak and strong disorder
25 Apr Gianluca Guadagni
Extreme value theory

Abstracts

A stochastic model of adaptive-synaptogenesis: Will it construct energy-efficient neural-like computation?

Chip Levy

The forebrain is a massively parallel analog computer consisting of 10^10 computational elements (neurons) with about 10^14 connections (synapses), which mediate communication between neurons. Somehow the environment is represented (encoded) by the states, or successive states, of neurons. As an analog computer, the brain computes via signal flow, i.e., its computations are merely transformations on the encodings. Another characteristic of the brain is its extreme energy efficiency. For example, detailed digital simulations of brain states consume ca. 10^7 times more power than the brain itself consumes (20 watts = 20 joules/sec).
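The efficiency gap cited above can be checked with a quick back-of-envelope calculation (an illustrative sketch only; the figures are the round numbers quoted in the abstract, not new data):

```python
# Rough arithmetic for the brain-vs-simulation power gap quoted above.
brain_power_watts = 20.0     # ~20 J/s cited for the whole brain
efficiency_factor = 1e7      # simulations cited as ~10^7 times more power

sim_power_watts = brain_power_watts * efficiency_factor
sim_power_megawatts = sim_power_watts / 1e6

# A detailed digital simulation at this ratio would draw on the order of
# hundreds of megawatts, versus the brain's 20 W.
print(f"simulated-brain power: {sim_power_watts:.1e} W "
      f"(~{sim_power_megawatts:.0f} MW)")
```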

The problem addressed in this talk concerns the formation of connections between neurons. This process is relatively rapid in young animals but seems to continue throughout life (it is the basis of memory encoding). The 10^14 connections in the forebrain far exceed what the genetic (DNA) information can pre-specify, and in any event one would want the connections to reflect information in the environment and the disposition of the organism's sensors.

This talk presents a conjectured stochastic algorithm that directs the connection-forming process. The key problem this algorithm solves is achieving some pre-set bits-per-average-joule performance by a tier of neurons, without ever measuring bits or joules. The algorithm uses an appropriately biased random process to direct the formation of new connections (synaptogenesis) and can discard (shed) connections based merely on neuronally detectable activity correlations.


Limited choice and randomness in evolution of networks

Shankar Bhamidi (UNC)

The last few years have seen an explosion in network models describing the evolution of real-world networks. In the context of mathematical probability, one aspect which has seen intense focus is the interplay between randomness and limited choice in the evolution of networks, ranging from the emergence of the giant component to the new phenomenon of "explosive percolation" and the power of two choices. I will describe ongoing work in understanding such dynamic network models and their connections to classical constructs such as the standard multiplicative coalescent and local weak convergence of random trees.

Estimates for the heat kernel with Dirichlet boundary condition

Janna Lierl (Cornell)

I will present two-sided Gaussian bounds for the heat kernel corresponding to the Laplacian with a drift in certain Euclidean domains under a Dirichlet boundary condition. These bounds will be derived for domains that are inner uniform. Inner uniformity is a condition on the boundary of the domain that is described solely in terms of the intrinsic length metric of the domain. Interesting examples of inner-uniform domains are the interior of the Koch snowflake and the complement of a convex set. When the domain is bounded, our estimates imply intrinsic ultracontractivity of the associated semigroup.

From additive functions to Wiener maps

Linan Chen (McGill)

I will introduce two measure-theoretic variations of Cauchy's classical functional equation for additive functions, in both finite and infinite dimensions. I will examine the linearity and continuity of the solutions to such generalized Cauchy functional equations, with particular emphasis on how different their natures are in finite versus infinite dimensions. These observations lead naturally to results about the structure of abstract Wiener spaces. This is joint work with Daniel Stroock.

Tree polymers under weak and strong disorder

Ed Waymire (Oregon State)

Tree polymer models are simplifications of 1+1 dimensional lattice polymers, made up of polygonal paths of a (nonrecombining) binary tree with random path probabilities prescribed by (normalized) multiplicative cascade measures. The a.s. probability laws of these paths are of interest under weak and strong types of disorder. Some recent results, speculations, and conjectures will be presented for this class of models under both weak and strong disorder conditions. In particular, results are included that suggest an explicit formula for the asymptotic variance of the "free end" under strong disorder. This is based on joint work with Stanley Williams and Torrey Johnson.
