Thursday, May 13 | 1:00 PM - 3:00 PM
Analog computing is a promising and practical candidate for solving complex computational problems involving algebraic and differential equations. At the fundamental level, an analog computing framework can be viewed as a dynamical system that evolves following fundamental physical principles, like energy minimization, to solve a computing task. Additionally, conservation laws, such as conservation of charge, energy, or mass, provide a natural way to couple and constrain spatially separated variables. Taking a cue from these observations, in my thesis, I have explored a novel dynamical systems-based computing framework that exploits naturally occurring analog conservation constraints to solve a variety of optimization and learning tasks. The model is based on a special class of multiplicative update algorithms called growth transforms and can be applied to both real and complex domains. Additionally, this computational model naturally satisfies conservation constraints for reaching the minimum energy state with respect to a system-level cost function and is generic enough to be applied to different computing and application paradigms.
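To make the idea of a conservation-constrained multiplicative update concrete, here is a minimal sketch of the classical Baum-Eagon-style growth transform, which the thesis's framework generalizes. The objective P(x), its gradient, and the starting point below are illustrative choices, not taken from the thesis; the key property on display is that each multiplicative step preserves the simplex constraint sum(x) = 1, the "conservation constraint" of the dynamical system, while increasing the objective.

```python
import numpy as np

def growth_transform_step(x, grad):
    """One multiplicative growth-transform update.

    Each variable is scaled by its partial derivative and the result is
    renormalized, so sum(x) = 1 is preserved at every step.
    """
    y = x * grad
    return y / y.sum()

# Toy objective: P(x) = x0 * x1 * x2, whose maximum on the simplex
# is the uniform point (1/3, 1/3, 1/3).
def grad_P(x):
    return np.array([x[1] * x[2], x[0] * x[2], x[0] * x[1]])

x = np.array([0.6, 0.3, 0.1])
for _ in range(50):
    x = growth_transform_step(x, grad_P(x))
```

After the iterations, x sits at the constrained maximizer (1/3, 1/3, 1/3), and at every intermediate step the coordinates remained nonnegative and summed to one, without any explicit projection step.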
First, I will demonstrate how a real domain version of the framework can be used to develop a continuous-time annealing algorithm for solving non-convex and discrete global optimization problems. I will also show how a discrete variant of the model can be used for implementing decentralized optimization algorithms like winner-take-all and ranking. Next, I will present an extension of the dynamical system model to the complex domain and show how it can be used for designing a novel energy-based learning model. The formulation associates both “active” and “reactive” energy metrics with the model, in contrast to traditional energy-based learning models that adopt a single energy metric. The proposed framework ensures that the network’s reactive energy is conserved while dissipating energy only during learning and exploits the phenomenon of electrical resonance for storing the learned parameters. Finally, I will present how a variant of the complex domain generalization can be used for data “sonification” to detect anomalies/novelties in high-dimensional temporally varying data using audio signatures. The algorithm takes as input the data and optimization parameters underlying the learning or prediction task and combines them with psychoacoustic parameters defined by the user. As a result, the proposed framework outputs audio signatures that not only encode some statistical properties of the high-dimensional data but also reveal the underlying complexity of the optimization/learning process. In summary, I have developed a generalized dynamical system-based analog computing framework that can incorporate different types of conservation constraints to solve a variety of learning and optimization tasks at steady state.
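The decentralized winner-take-all behavior mentioned above can be illustrated with the same kind of multiplicative update. This is a hedged sketch, not the thesis's exact formulation: applying a Baum-Eagon-type step to the objective P(x) = sum_i x_i^2 amplifies the largest coordinate at every iteration while conserving sum(x) = 1, so all of the "mass" concentrates on the winning node.

```python
import numpy as np

# Winner-take-all via a multiplicative growth-transform-style update
# for P(x) = sum_i x_i^2: each step is x_i <- x_i^2 / sum_j x_j^2.
# The largest entry grows relative to the others at every iteration,
# while the simplex (conservation) constraint sum(x) = 1 is preserved.
# (Illustrative only; the thesis's discrete WTA/ranking networks are
# more general than this toy update.)
x = np.array([0.5, 0.3, 0.2])
for _ in range(10):
    x = x**2 / np.sum(x**2)
```

Each node's update depends only on its own state and a shared normalizing sum, which is what makes the scheme naturally decentralized; after a handful of iterations essentially all of the mass sits on the coordinate that started largest.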