Past Events

Seminars are held Tuesdays and Thursdays in POB 6.304 from 3:30-5:00 pm, unless otherwise noted. Speakers include scientists, researchers, visiting scholars, potential faculty, and ICES/UT faculty and staff. Everyone is welcome to attend. Refreshments are served at 3:15 pm.

Thursday, Apr 25

  • Additional Information

    Hosted by Omar Ghattas

    Sponsor: ICES Seminar

    Speaker: Hari Sundar

    Speaker Affiliation: Assistant Professor, School of Computing, University of Utah

  • Abstract

    We present a highly scalable framework that targets problems of interest to the numerical relativity and broader astrophysics communities. This framework combines a parallel octree-refined adaptive mesh with wavelet adaptive multiresolution and a physics module to solve the Einstein equations of general relativity. The goal of this work is to perform advanced, massively parallel numerical simulations of intermediate-mass-ratio inspirals of binary black holes with mass ratios on the order of 100:1. These studies will be used to generate waveforms for use in the data analysis of the Laser Interferometer Gravitational-Wave Observatory and to calibrate semi-analytical approximate methods. Our framework consists of a distributed-memory octree-based adaptive meshing framework in conjunction with a sophisticated code generator from symbolic expressions. The code generator makes our code portable across different architectures, including SIMD vectorization, OpenMP, and CUDA, combined with efficient distributed-memory adaptive data structures. The equations corresponding to the target application are written in symbolic notation, and generators for different architectures can be added independently of the application. This symbolic interface also makes our code extensible, and as such it has been designed to easily accommodate many existing algorithms in astrophysics for plasma dynamics and radiation hydrodynamics. Our adaptive meshing algorithms and data structures have been optimized for modern architectures with deep memory hierarchies, enabling our framework to achieve excellent performance and scalability on modern leadership architectures. We demonstrate excellent weak scalability up to 131K cores on Oak Ridge National Laboratory's Titan for binary mergers with mass ratios up to 100.
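
    The code-generation idea above can be illustrated with a toy symbolic layer. The sketch below is not the speaker's framework; the class names and the example right-hand side are invented for illustration. It builds a small expression tree in Python and emits both a C-style expression string and a reference evaluation:

```python
import operator

class Expr:
    """Base class: overloading operators builds an expression tree."""
    def __add__(self, other): return Op("+", self, _wrap(other))
    def __sub__(self, other): return Op("-", self, _wrap(other))
    def __mul__(self, other): return Op("*", self, _wrap(other))

class Var(Expr):
    def __init__(self, name): self.name = name
    def emit_c(self): return self.name           # C code: just the identifier
    def eval(self, env): return env[self.name]   # reference evaluation

class Const(Expr):
    def __init__(self, value): self.value = value
    def emit_c(self): return repr(float(self.value))
    def eval(self, env): return self.value

class Op(Expr):
    fns = {"+": operator.add, "-": operator.sub, "*": operator.mul}
    def __init__(self, sym, left, right):
        self.sym, self.left, self.right = sym, left, right
    def emit_c(self):
        return f"({self.left.emit_c()} {self.sym} {self.right.emit_c()})"
    def eval(self, env):
        return self.fns[self.sym](self.left.eval(env), self.right.eval(env))

def _wrap(x):
    return x if isinstance(x, Expr) else Const(x)

# A toy "physics" right-hand side written symbolically (hypothetical example).
u, u_xx, c = Var("u"), Var("u_xx"), Var("c")
rhs = c * u_xx - u

print(rhs.emit_c())                                  # generated C expression
print(rhs.eval({"u": 2.0, "u_xx": 4.0, "c": 0.5}))   # reference evaluation
```

    In a production system of the kind described, the same symbolic tree would feed separate emitters for SIMD, OpenMP, or CUDA backends, which is what makes the backends independent of the application equations.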

    Bio
    Hari Sundar is an Assistant Professor in the School of Computing at the University of Utah. His research focuses on the development of computationally optimal parallel, high-performance algorithms that are efficient and scalable on state-of-the-art architectures. It is driven by applications in the biosciences, geophysics, and computational relativity. His research has resulted in the development of state-of-the-art distributed algorithms for adaptive mesh refinement, geometric multigrid, the fast Gauss transform, and sorting. He received his Ph.D. from the University of Pennsylvania and was a postdoc at the Institute for Computational Engineering & Sciences at the University of Texas at Austin.


Tuesday, Apr 23

Self-Organization and Mechanics in the Cell -- Different room # POB 4.304

Tuesday, Apr 23, 2019 from 3:30PM to 5PM | POB 4.304

Important Update: NOTE: POB 4.304
  • Additional Information

    Hosted by George Biros

    Sponsor: ICES Seminar

    Speaker: Michael Shelley

    Speaker Affiliation: Professor, Center for Computational Biology, Flatiron Institute & Courant Institute, NYU

  • Abstract

    The inside of a cell is an active place, with molecular machines busy positioning subcellular organelles, organizing themselves within membranes, or remodeling chromatin in the nucleus. I will discuss how mathematical modeling and large-scale simulations have interacted with experimental measurements and perturbations of such motor-driven biomechanical processes within the cell. This includes how the spindle finds its place in the cell, which is best treated as a complex mechanical system that works with transitory elements, and how motor activity and hydrodynamic interactions may underlie an apparently self-organizing dynamics of chromatin in the nucleus.

    Bio
    Michael Shelley joined the Simons Foundation in 2016 to work on the modeling and simulation of complex systems arising in physics and biology. He is an applied mathematician who co-founded and co-directs the Courant Institute’s Applied Mathematics Laboratory at New York University. Shelley joined the Courant Institute in 1992 and is the Lilian and George Lyttle Professor of Applied Mathematics. He holds a B.A. in mathematics from the University of Colorado and a Ph.D. in applied mathematics from the University of Arizona. He was a postdoctoral researcher at Princeton University and a member of the mathematics faculty at the University of Chicago before joining NYU. Shelley has received the François Frenkiel Award from the American Physical Society and the Julian Cole Lectureship from the Society for Industrial and Applied Mathematics, and he is a Fellow of both societies.


Friday, Apr 19

Isogeometric Analysis-Suitable Geometry

Friday, Apr 19, 2019 from 11AM to 12PM | POB 6.304

  • Additional Information

    Hosted by Max Bremer and William Ruys

    Sponsor: ICES Seminar-Student Forum Series

    Speaker: Kendrick Shepherd

    Speaker Affiliation: Oden Institute, UT Austin

  • Abstract

    The engineering design-through-manufacturing process hinges on accurate representation of geometric data for accurate analyses, design optimization, and computer-aided manufacturing. Unfortunately, 60-80% of the design-through-analysis process is devoted to geometry rebuilding in the form of meshing. The current meshing process approximates intended geometries, poorly represents the spectra of underlying geometries, limits feedback between analyses and designs, and involves extensive manual interaction. In this work, it is proposed that computer-aided design (CAD) geometries be rebuilt using CAD-suitable representations generated by frame fields, discrete Ricci flow, and Strebel differentials. Inspired by the fields of geometry and topology, these tools will generate high-quality, geometry-fitting, feature-aligning, globally semi-structured quadrilateral representations with high continuity, both automatically and with potential for user interaction. Unlike current meshing methods, the resulting geometries will be simultaneously suitable for engineering design, engineering analysis, and engineering manufacturing.


Thursday, Apr 18

Numerical simulations of the global oceanic internal tide and gravity wave fields

Thursday, Apr 18, 2019 from 3:30PM to 5PM | POB 6.304

  • Additional Information

    Hosted by Patrick Heimbach

    Sponsor: ICES Seminar

    Speaker: Brian Arbic

    Speaker Affiliation: Associate Professor, Department of Earth and Environmental Sciences, University of Michigan, Ann Arbor, MI

  • Abstract

    We discuss numerical simulations of oceanic internal gravity waves (IGWs) on a global scale, on US Navy, NASA, and European high performance computing platforms. IGWs are waves that exist on the interfaces between oceanic layers of different densities. IGWs of tidal frequency are known as internal tides. Beyond tidal frequencies, there is a spectrum of IGWs known as the IGW continuum. The rollover and breaking of IGWs controls most of the mixing in the open ocean beneath the mixed layer. IGWs also impact the speed of sound, and yield a measurable sea surface height (SSH) signal. Therefore, IGWs are important for satellite altimetry missions, including the upcoming Surface Water and Ocean Topography (SWOT) mission, and for operational oceanography in general. We describe our work with the US Navy HYbrid Coordinate Ocean Model (HYCOM), in which we pioneered high-resolution global ocean models simultaneously forced by atmospheric fields and the astronomical tidal potential. We also examine newer simulations performed under similar conditions, on NASA supercomputers, with the Massachusetts Institute of Technology general circulation model (MITgcm). Finally, we briefly describe related work done with the European ocean forecasting model, the Nucleus for European Modeling of the Oceans (NEMO). We summarize several papers comparing the modeled internal tides and IGW continuum spectrum with altimetry and observations from moorings. We briefly discuss the generation of the continuum spectrum and the potential implications for a better understanding of ocean mixing.


Wednesday, Apr 17

Radiomic machine learning for precision breast cancer prevention

Wednesday, Apr 17, 2019 from 3PM to 4PM | POB 6.304

  • Additional Information

    Hosted by Robert Moser

    Sponsor: Faculty Candidate Seminar: Joint Oden Institute/BME

    Speaker: Aimilia Gastounioti

    Speaker Affiliation: Research Associate, Department of Radiology, Computational Breast Imaging Group, Center for Biomedical Image Computing and Analytics, University of Pennsylvania

  • Abstract

    Breast cancer risk assessment has become increasingly important for forming tailored breast cancer screening and prevention strategies. An emerging approach to evaluate breast cancer risk profiles more accurately and help better guide personalized patient care is the incorporation of computational imaging phenotypes. In this talk, I will discuss novel computational approaches that leverage radiomic machine learning in breast cancer risk estimation from various complementary viewpoints. First, I will describe a systematic comparative study on large-population data involving the two types of images generated from digital mammography, investigating potential differences in quantitative measurements that would have implications for their interpretation. I will then present a novel computational framework that allows breast anatomy to drive imaging phenotyping of breast cancer risk. Third, I will discuss the use of cutting-edge deep learning to better capture the breast parenchymal complexity patterns associated with breast cancer risk.

    Bio
    Aimilia Gastounioti, PhD, is a Research Associate in the Department of Radiology at the University of Pennsylvania (UPenn), where she is a member of the Computational Breast Imaging Group (CBIG) in the Center for Biomedical Image Computing and Analytics (CBICA). Aimilia received her Ph.D. in Biomedical Engineering from the National Technical University of Athens in 2014, part of which was performed in collaboration with the Ecole Centrale de Paris. She has also received her Clinical Research Certificate from UPenn for training in clinical epidemiology, biostatistics, and translational research. Her research interests center on translational biomedical imaging, with a primary focus on radiomic machine learning for cancer risk prediction, diagnosis, and prognosis. She has co-authored 18 journal articles, 2 book chapters, and 25 conference proceedings papers, as well as 16 abstracts in premier scientific meetings. She has led a three-year Susan G. Komen Foundation fellowship as PI and has served as co-PI on two seed grants funded by internal sources at UPenn. Her research has been recognized by the Engineering in Medicine & Biology (EMB) Greece Chapter (2014) and has been included in Research Highlights of the IEEE Journal of Biomedical and Health Informatics (2015), the SPIE Medical Imaging Conference (2017), and the 30th Anniversary AACR Special Conference on Convergence: Artificial Intelligence, Big Data, and Prediction in Cancer (2018). Aimilia is an Associate member of the American Association for Cancer Research (AACR) and the Institute for Translational Medicine and Therapeutics (ITMAT) at the University of Pennsylvania.


Tuesday, Apr 16

Incorporating Structure into Machine Learning - TIME Changed

Tuesday, Apr 16, 2019 from 9:30AM to 10:30AM | POB 6.304

Important Update: PLEASE Note: Change in TIME
  • Additional Information

    Hosted by Ufuk Topcu

    Sponsor: ICES Seminar

    Speaker: Osbert Bastani

    Speaker Affiliation: Research Assistant Professor, University of Pennsylvania

  • Abstract

    While modern deep learning algorithms have proven to be surprisingly powerful, their unstructured nature has posed challenges that are difficult to overcome. I will talk about two projects aiming to address some of these challenges. First, safety is a major obstacle to using controllers based on reinforcement learning on real robots. I will describe how to use structured models---in particular, decision trees---to enable safe reinforcement learning. Second, deep generative models have difficulty capturing global structure in images such as repetitions and symmetries. I will describe how to incorporate programmatic structure into these models to capture global structure.

    Bio
    Osbert Bastani is a research assistant professor at the University of Pennsylvania. He is broadly interested in research at the intersection of machine learning and programming languages, and is currently working on trustworthy machine learning.


  • Additional Information

    Hosted by Robert Moser

    Sponsor: Faculty Candidate Seminar: Joint Oden Institute/BME

    Speaker: Noelia Grande Gutiérrez

    Speaker Affiliation: Research Assistant, PhD Candidate, Mechanical Engineering, Stanford University

  • Abstract

    Computational methods are emerging as valuable tools to support clinical decision-making, risk assessment, and device design in cardiovascular medicine. In this talk, I will apply computational simulations to investigate the implications of coronary artery aneurysm hemodynamics for thrombus formation in children with Kawasaki disease, the most common cause of acquired heart disease in children. First, I will discuss novel image processing methods that use the transluminal attenuation gradient (TAG) to extract functional information from CT scans. I demonstrate significantly abnormal TAG values in aneurysms caused by Kawasaki disease compared to normal coronary arteries.

    Second, I will present an image-based computational framework combining a deep understanding of coronary physiology with advanced numerical methods and high-performance computing to obtain fully resolved patient-specific hemodynamic data relevant to thrombotic risk stratification. Simulations are performed with finite element methods incorporating fluid-structure interaction and closed-loop lumped-parameter models to represent vascular boundary conditions. The primary translational goal is to support clinical decisions about whether and when a patient needs to start systemic anticoagulation therapy. My results demonstrate that hemodynamic variables such as wall shear stress and residence time are significantly more predictive of thrombotic risk than the anatomical measurements currently used in clinical practice.

    Finally, I will discuss biochemical aspects of thrombus initiation and how these processes can be modeled using a continuum approach. I will present a new model based on scalar transport and incorporating velocity fields from patient-specific simulations to track activation and accumulation of platelets and other blood components critical to the coagulation cascade. This model provides a new approach to investigate thrombus initiation from a patient-specific perspective, which could help identify regions at higher risk of thrombosis as well as strategies for thrombosis prevention.
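
    As a generic illustration of the continuum scalar-transport idea (not the speaker's model; the one-dimensional geometry, velocity, and coefficients are invented stand-ins), a concentration field can be advected by a prescribed velocity and diffused with a simple upwind finite-difference scheme:

```python
# 1D advection-diffusion of a scalar concentration c (e.g., a blood component)
# in a prescribed velocity field, using an explicit upwind scheme.
# All parameter values are assumed for illustration.
N, L = 200, 1.0
dx = L / N
u, D = 1.0, 1e-3                              # velocity and diffusivity
dt = 0.4 * min(dx / u, dx * dx / (2 * D))     # conservative stable time step

c = [0.0] * N
for i in range(N // 10):                      # initial bolus near the inlet
    c[i] = 1.0

def step(c):
    new = c[:]
    for i in range(1, N - 1):
        adv = -u * (c[i] - c[i - 1]) / dx              # upwind advection (u > 0)
        dif = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx**2
        new[i] = c[i] + dt * (adv + dif)
    new[0] = 0.0                # clean inflow
    new[-1] = new[-2]           # crude zero-gradient outflow
    return new

for _ in range(100):
    c = step(c)
# The bolus has advected downstream and smeared; concentrations stay in [0, 1].
```

    A patient-specific version would replace the prescribed 1D velocity with simulated 3D hemodynamic fields and add reaction terms for platelet activation.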

    Bio
    Noelia Grande Gutiérrez is a PhD candidate in Mechanical Engineering at Stanford University. She obtained an M.S. in Engineering Sciences from the University of California, San Diego, an M.S. in Biomedical Engineering from the University of Barcelona, and her B.S. in Aerospace Engineering from the Technical University of Madrid. Her research interests lie at the intersection of computational engineering and cardiovascular medicine, and involve the development and application of multi-physics models that support clinical decision-making and provide novel insight into cardiovascular disease. For her doctoral research, she received an American Heart Association fellowship. Under the supervision of Dr. Alison Marsden, she has worked on applying patient-specific computational simulations to understand the role of hemodynamics in coronary artery aneurysm thrombosis.


Monday, Apr 15

Taming nonconvexity: from smooth to nonsmooth problems and beyond

Monday, Apr 15, 2019 from 11AM to 12PM | POB 6.304

  • Additional Information

    Hosted by Robert Moser

    Sponsor: Faculty Candidate Seminar: Joint Oden Institute/ECE

    Speaker: Ju Sun

    Speaker Affiliation: Postdoctoral Scholar, Stanford University

  • Abstract

    Most applied problems we encounter can be naturally written as nonconvex optimization, for which obtaining a local minimizer is computationally hard in theory, never mind the global minimizer. In practice, however, simple numerical methods often work surprisingly well in finding high-quality solutions, e.g., training deep neural networks.

    In this talk, I will describe our recent effort in bridging the mysterious theory-practice gap for nonconvex optimization, in the context of solving practical problems in signal processing, machine learning, and scientific imaging. 1) I will highlight a family of smooth nonconvex problems that can be solved to global optimality using simple numerical methods, independent of initialization. 2) The discovery, however, does not cover nonsmooth functions, which are frequently used to encode structural objects (e.g., sparsity) or achieve robustness. I will introduce tools from nonsmooth analysis, and demonstrate how nonsmooth, nonconvex problems can also be analyzed and solved in a provable manner. 3) Toward the end, I will provide examples to show how innovative problem formulation and physical design can help to tame nonconvexity.
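
    The first point can be illustrated with a textbook example that is not taken from the talk: the smooth nonconvex function f(x) = (x^2 - 1)^2, whose local minimizers (x = ±1) are all global. Plain gradient descent reaches a global minimizer from essentially any initialization:

```python
# A benign nonconvex landscape: every local minimizer of f is global,
# so plain gradient descent succeeds regardless of initialization.
def f(x):
    return (x * x - 1.0) ** 2

def grad(x):
    return 4.0 * x * (x * x - 1.0)

def gradient_descent(x0, step=0.01, iters=5000):
    x = x0
    for _ in range(iters):
        x -= step * grad(x)
    return x

for x0 in (-2.0, -0.3, 0.7, 1.5):
    x = gradient_descent(x0)
    print(f"x0={x0:+.1f} -> x={x:+.4f}, f(x)={f(x):.2e}")
```

    Note that x = 0 is a saddle-like stationary point, but generic initializations avoid it; the structural results described in the talk make this kind of behavior rigorous for much richer problem families.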

    Bio
    Ju Sun is a postdoctoral scholar at Stanford University, working with Professor Emmanuel Candès. Prior to this, he received his Ph.D. in Electrical Engineering from Columbia University in 2016 (2011--2016) and his B.Eng. in Computer Engineering (with a minor in Mathematics) from the National University of Singapore in 2008 (2004--2008). His research interests span computer vision, machine learning, numerical optimization, signal/image processing, and high-dimensional data analysis. Recently, he has been particularly fascinated by why simple numerical methods often solve nonconvex problems surprisingly well (a topic on which he maintains a bibliographic webpage: http://sunju.org/research/nonconvex/ ) and by the implications for representation learning. He won the best student paper award at SPARS'15 and an honorable mention for his doctoral thesis in the New World Mathematics Awards (NWMA) 2017.


Friday, Apr 12

  • Additional Information

    Hosted by Tom O'Leary-Roseberry

    Sponsor: ICES Seminar-Babuska Forum Series

    Speaker: Yuanxun Bao

    Speaker Affiliation: Postdoctoral Fellow, Oden Institute, UT Austin

  • Abstract

    Recent advances in colloidal synthesis allow control of particle shape and open new opportunities to explore colloidal self-assembly at a higher level of complexity. Numerical simulations are particularly suitable for exploring the effects of shape asymmetry on the collective behavior of colloidal systems without fabricating the particles in the first place. In chemical engineering, the methods of Brownian and Stokesian Dynamics are popular but limited to spherical particles. In this talk I will present recent progress in numerical methods for simulating colloidal suspensions of complex rigid particles beyond spheres. I will present fast methods that address the following challenges: (1) incorporating the long-ranged many-body hydrodynamic interactions mediated by the viscous fluid; (2) consistently generating the hydrodynamically correlated Brownian motion; (3) achieving (near-)linear scaling as the number of particles increases.

    In the first part of the talk, I will present a relatively low-accuracy but flexible and simple rigid multiblob method based on the immersed boundary method and the Stokes equations. Experimentally relevant configurations such as no-slip walls and confinement can be implemented as boundary conditions in the fluid solver. Thermal fluctuations, and thus Brownian motion, can be consistently modeled by including a stochastic stress in the momentum equation, as dictated by fluctuating hydrodynamics. In the second part, I will present a resolved, high-accuracy fluctuating boundary integral method for Brownian suspensions. For both methods, I will focus on developing (near-)linear-scaling algorithms for generating both the deterministic and stochastic (Brownian) contributions to the particle velocities.
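
    As a much-simplified illustration of the deterministic-plus-stochastic velocity split (free diffusion of independent particles only, with none of the hydrodynamic coupling the talk addresses; all parameter values are assumed), an Euler-Maruyama step for overdamped Brownian dynamics can be written as:

```python
import random

# Overdamped Brownian dynamics, Euler-Maruyama scheme: each displacement has
# a deterministic drift part (force / drag) and a stochastic part whose
# variance is fixed by the fluctuation-dissipation relation, var = 2*D*dt.
random.seed(0)

kT, gamma = 1.0, 1.0          # thermal energy and drag coefficient (assumed)
D = kT / gamma                # Einstein relation
dt, steps, n = 0.01, 100, 10000

def force(x):
    return 0.0                # free particle; replace with -dU/dx for a potential

xs = [0.0] * n
for _ in range(steps):
    for i in range(n):
        drift = force(xs[i]) / gamma * dt
        noise = random.gauss(0.0, (2.0 * D * dt) ** 0.5)
        xs[i] += drift + noise

msd = sum(x * x for x in xs) / n   # mean-square displacement at t = steps*dt
print(msd)                         # theory: 2*D*t = 2.0
```

    The hard part in the suspensions the talk describes is that the noise must be correlated across particles through the hydrodynamic mobility, which is what the fluctuating hydrodynamics and fluctuating boundary integral formulations provide at (near-)linear cost.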

    Bio
    Dr. Yuanxun Bao received his Ph.D. in Mathematics from the Courant Institute of Mathematical Sciences, New York University in 2018. His research interests are in the areas of multiscale computational modeling of fluids and materials science. In particular, he is interested in developing computational methods for studying physical and biological systems where stochastic effects are important. His current work at ICES is under the supervision of Professor George Biros, and focuses on the simulation, optimization and design under uncertainty in materials science.


Friday, Apr 12

A Scalable Heuristic for Fastest-Path Computation on Very Large Road Maps

Friday, Apr 12, 2019 from 2PM to 3PM | POB 2.402 (Electronic)

Important Update: NOTE: Different location
  • Additional Information

    Hosted by Chandrajit Bajaj

    Sponsor: ICES Seminar

    Speaker: Craig Gotsman

    Speaker Affiliation: Dean, Ying Wu College of Computing and Distinguished Professor, New Jersey Institute of Technology

  • Abstract

    This talk will present a simple but very effective improvement to a variant of the classical shortest-path algorithm, a cornerstone in computer science.

    The fastest-path (or minimal travel time) query between two points in a very large road map is an increasingly important primitive in modern transportation and navigation systems, so very efficient computation of these paths on detailed road maps under dynamic traffic conditions is critical for system performance and throughput. We present a method to compute an effective admissible heuristic for the fastest-path travel time between two points on a road map, which can be used to significantly accelerate the classical A* algorithm when computing fastest paths in road maps. Our method is based on two hierarchical sets of separators of the map, represented by two binary trees. A preprocessing step computes a short vector of values per road junction based on the separator trees, which is then stored with the map and used to efficiently compute the heuristic at the online query stage. We demonstrate experimentally that this method scales well to maps at the continental level, providing a better-quality heuristic, and thus more efficient A* search, for fastest-path queries between points at all distances, relative to other known heuristics.

    Joint work with Renjie Chen - Max-Planck Institute for Informatics, Saarbrücken, Germany.
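
    The role of the admissible heuristic can be sketched in a few lines. The toy example below is not the speaker's separator-based method; a unit-cost grid and the Manhattan-distance heuristic are stand-ins for a road map and a travel-time lower bound. Running A* with the heuristic and again with h = 0 (plain Dijkstra) returns the same optimal cost while expanding far fewer nodes:

```python
import heapq

def a_star(neighbors, start, goal, h):
    """A* search. If h never overestimates the remaining cost (admissible,
    here also consistent), the goal's cost is optimal when it is expanded."""
    dist = {start: 0.0}
    done = set()
    pq = [(h(start), start)]
    expanded = 0
    while pq:
        _, u = heapq.heappop(pq)
        if u in done:                 # stale heap entry
            continue
        done.add(u)
        expanded += 1
        if u == goal:
            return dist[u], expanded
        for v, w in neighbors(u):
            nd = dist[u] + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd + h(v), v))
    return float("inf"), expanded

# Toy "road map": a 50x50 unit-cost grid.
N = 50
def neighbors(u):
    x, y = u
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < N and 0 <= ny < N:
            yield (nx, ny), 1.0

start, goal = (0, 0), (N - 1, 0)
manhattan = lambda u: abs(u[0] - goal[0]) + abs(u[1] - goal[1])

cost_astar, exp_astar = a_star(neighbors, start, goal, manhattan)
cost_dijk, exp_dijk = a_star(neighbors, start, goal, lambda u: 0.0)
print(cost_astar == cost_dijk, exp_astar < exp_dijk)   # same cost, fewer expansions
```

    The better the heuristic lower-bounds the true remaining travel time, the tighter the search is focused, which is exactly what the separator-tree preprocessing in the talk is designed to deliver on continental-scale maps.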

    Bio
    Craig Gotsman is a Distinguished Professor of Computer Science at the New Jersey Institute of Technology (NJIT) and has been Dean of NJIT’s Ying Wu College of Computing since January 2017. Prior to that he was a co-founder of the Cornell Tech campus in New York City and Professor and Founding Director of the Jacobs Technion-Cornell Innovation Institute there. Before Cornell Tech, Gotsman held the Hewlett-Packard Chair in Computer Engineering at Technion – Israel Institute of Technology, where he was a faculty member for 20 years. He also served as Deputy Senior Vice President of the Technion. He received his Ph.D. in Computer Science from the Hebrew University of Jerusalem in 1991.

    Gotsman, who co-founded the Technion Center for Graphics and Geometric Computing, is active in research on 3D computer graphics, geometric modeling, animation and computational geometry. He has been a visiting professor at Harvard University, ETH Zurich and INRIA Sophia Antipolis, and a research scientist at MIT. He has published 160 papers in the professional literature, and served on the editorial boards and on the program committees of the most important publication venues in computer graphics, including ACM SIGGRAPH, ACM TOG and IEEE TVCG. Gotsman is a Fellow of the National Academy of Inventors and a Fellow of the Academy of Europe.

    An active entrepreneur, Gotsman holds eleven U.S. patents, and founded three startup companies. The most recent one, Perceptiko, founded in 2014 and acquired in 2017, commercialized his academic research and developed real-time video processing technology using the output of a depth camera to enhance the video conferencing experience. Gotsman has also consulted for numerous Fortune 100 companies including Hewlett-Packard, Intel, Nokia, Samsung, Shell Oil and Disney.