2010-11 Distinguished Lecture Series

Each year the Department of Electrical and Computer Engineering, in conjunction with the Information Infrastructure Institute (iCUBE), presents a Distinguished Lecture Series, which brings prominent researchers in the electrical, computer, and software engineering fields to campus. All lectures take place from 1:10 to 2 p.m. at the Alliant Energy-Lee Liu Auditorium in Howe Hall.

Videos from the 2010-11 Distinguished Lecture Series are available to ISU students, faculty, and staff for educational purposes only. To obtain access to the videos, e-mail the department’s Communications Specialist or stop by 2215 Coover Hall.

Chih-Ming Ho


September 16


Dr. Chih-Ming Ho, Ben Rich-Lockheed Martin Professor and Director of the Center for Cell Control, Henry Samueli School of Engineering and Applied Science, University of California, Los Angeles

Abstract: A complex system is composed of a large number of interacting building blocks or elements that self-organize, generating emergent properties that usually cannot be linked directly to those of the individual building elements. A cell is the most fundamental biological system, and yet it is a complex system.

In each living cell, interactions among biomolecules, such as proteins and nucleic acids, form the foundation of extensive networks of signaling and regulatory pathways. Emergent cellular functionalities derive from the self-organization of these pathways but cannot be related easily to individual biomolecular interactions. Exploring and understanding cell functions through a bottom-up reductionist approach therefore presents significant challenges, owing to the sheer magnitude of pathway processes and pathway crosstalk. Furthermore, we frequently intend to direct cellular phenotypic and genotypic outcomes toward a desired state, a key example being the application of pharmacological agents to treat diseased cells in medicine. In other words, drug application is an expedition to manipulate cell fate by stimulating a network that is far from understood.

Rather than laboriously mapping out the detailed cascade of signaling pathways from the bottom up, we take a top-down approach by employing a feedback system control (FSC) scheme to bypass the challenges associated with simultaneously considering multiple cellular regulatory pathways in cellular complex systems. In addition, we have harnessed these control schemes to rationally design combinatorial drug therapy modalities to direct the cellular system output with improved efficacy and low toxicity. This imposes another challenge that pertains to the large parameter space. For example, six drugs with 10 concentrations each would result in 1 million potential search trials. With the feedback system optimization approach, we have demonstrated that only tens of searches instead of 1 million cases are needed to identify the optimized drug cocktail. This work is supported by the NIH Nanomedicine Roadmap Program.
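The scale of the search problem can be sketched with a toy model: six drugs at 10 levels each give 10**6 candidate cocktails, yet a feedback loop that keeps only response-improving changes needs far fewer assay calls. Everything below (the response surface, the optimum, the search routine) is a hypothetical stand-in for illustration, not the actual FSC algorithm:

```python
import random

# Hypothetical setting: 6 drugs, each at one of 10 concentration levels,
# so exhaustive search would enumerate 10**6 candidate cocktails.
NUM_DRUGS, NUM_LEVELS = 6, 10
OPTIMUM = (7, 3, 9, 1, 5, 2)  # assumed best cocktail, unknown to the search

def response(cocktail):
    """Stand-in for a cell-assay readout; higher is better, peaks at OPTIMUM."""
    return -sum((c - o) ** 2 for c, o in zip(cocktail, OPTIMUM))

def feedback_search(iterations=60, seed=1):
    """Feedback loop: perturb one drug's level, keep the change only if the
    measured response improves. Uses `iterations` assay calls, not 10**6."""
    rng = random.Random(seed)
    best = tuple(rng.randrange(NUM_LEVELS) for _ in range(NUM_DRUGS))
    best_score = response(best)
    for _ in range(iterations):
        trial = list(best)
        trial[rng.randrange(NUM_DRUGS)] = rng.randrange(NUM_LEVELS)
        trial_score = response(tuple(trial))
        if trial_score > best_score:  # the feedback step
            best, best_score = tuple(trial), trial_score
    return best, best_score
```

Because only improving moves are accepted, the returned score is never worse than that of the random starting cocktail. Real FSC schemes use more sophisticated stochastic optimization over measured cellular responses, but the economy relative to exhaustive enumeration is the same point the abstract makes.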

Speaker biography: Chih-Ming Ho is director of the Center for Cell Control, an NIH Nanomedicine Roadmap Center at the University of California, Los Angeles. His research offers an engineering approach to determining optimal drug cocktails for the treatment of cancer and infectious diseases. The feedback system control scheme requires only tens of iterations to identify the most effective combinatorial drugs from millions of possibilities. Ho, a member of the National Academy of Engineering and an Academician of Academia Sinica, is known for his work in micro/nano fluidics, bio-nano technologies and turbulence. He was ranked by ISI as one of the top 250 most cited researchers worldwide in the entire engineering category and is also a fellow of the American Physical Society and American Institute of Aeronautics and Astronautics for his contributions in a wide spectrum of technology areas. He holds the Ben Rich-Lockheed Martin Professorship in the School of Engineering and Applied Science at UCLA and received his PhD from Johns Hopkins University.

*Lecture cosponsored by Committee on Lectures (funded by GSB)

Virgil D. Gligor


October 8


Virgil D. Gligor, Co-director of CyLab and Professor of Electrical and Computer Engineering, Carnegie Mellon University

Abstract: Few of the security-oriented system architectures proposed over the past four decades (e.g., fine-grain domains of protection, virtual machines) have made a significant difference in client-side security. In this presentation, I examine some of the reasons for this and some of the lessons learned to date. A focus on client-side security is warranted primarily because it is substantially more difficult to achieve in practice than server security, since clients interact with human users directly. I argue that system and application partitioning to meet user security needs is now feasible, and that special focus must be placed on how to design and implement trustworthy communication, not merely cryptographically secure channels, between system partitions.

Two forms of partitioning system and network components are described. The first, which is inspired by Lampson’s Red/Green separation idea, partitions system resources instead of “virtualizing” them, and switches between partitions only under (human) user control exercised via a trusted path. Neither operating systems nor applications can escape their partition or transfer control to other partitions behind the user’s back as a consequence of malware or insider attacks. The second form of partitioning separates programmer-selected, security-sensitive code blocks from untrusted operating system code, applications, and devices, and provides strong guarantees of data secrecy and integrity, as well as execution integrity, to an external entity via attestation. Trustworthy communication among partitions goes beyond secure channels, firewalls, guards, and filters. The extent to which one partition accepts input from or outputs to another depends on the accountability of, and trust established with, the input provider and output receiver. It also depends on input-rate throttling and output propagation control, which often require establishing some degree of control over remote communication endpoints. Several fundamental problems of trustworthy communication are outlined to underscore the need for additional research in this area.

Speaker biography: Virgil D. Gligor received his B.Sc., M.Sc., and Ph.D. degrees from the University of California, Berkeley. He taught at the University of Maryland between 1976 and 2007, and is currently a professor of electrical and computer engineering at Carnegie Mellon University and co-director of CyLab. Over the past 35 years, his research interests have ranged from access control mechanisms, penetration analysis, and denial-of-service protection to cryptographic protocols and applied cryptography. Gligor has been an editorial board member of several journals and the editor-in-chief of the IEEE Transactions on Dependable and Secure Computing. He served as the chair of ACM’s Special Interest Group on Security, Audit, and Control, and received the 2006 National Information Systems Security Award, given jointly by the National Institute of Standards and Technology and the National Security Agency in the United States.

Gordon W. Roberts


November 19


Gordon W. Roberts, James McGill Chair in Electrical and Computer Engineering, McGill University (Canada)

Abstract: This talk will describe the fundamentals of time-based data converters that are finding widespread use in electronic applications from data converters, filters, and frequency synthesizers to PLLs. The fundamental element of the time-based method is the delay element, from which all other building blocks are constructed. We shall begin by describing the principles of time-based signal processing techniques, which are based on time-difference variables involving step-like digital signals. Time-mode signal processing is a promising candidate to replace conventional voltage-mode methods, or at the very least, to augment traditional analog signal processing functions. Time-mode circuits are constructed largely from digital blocks like inverters, thereby requiring very little silicon area. Moreover, time-mode circuits can be calibrated by digitally controlling the number of inverters in cascade.
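One classic construction built from delay elements is the vernier delay line, in which two chains with slightly different stage delays digitize a time difference at a resolution equal to the delay mismatch. The behavioral sketch below is an illustrative toy model (the parameter values are assumptions, not figures from the talk):

```python
def vernier_tdc(delta_t, tau_slow=1.00, tau_fast=0.95, max_stages=200):
    """Toy vernier time-to-digital converter.

    A start edge leads a stop edge by delta_t. The start edge travels
    through stages of delay tau_slow, the stop edge through stages of
    tau_fast, so each stage closes the gap by (tau_slow - tau_fast).
    The stage count at which the stop edge catches the start edge
    digitizes delta_t with resolution tau_slow - tau_fast
    (0.05 time units with the defaults here).
    """
    gap = delta_t
    for n in range(max_stages):
        if gap <= 0:  # stop edge has caught the start edge
            return n
        gap -= tau_slow - tau_fast
    return max_stages  # out of range: delta_t exceeds full scale
```

For example, `vernier_tdc(0.2)` yields a code of 4, i.e., a 0.2 time-unit difference quantized at the 0.05 resolution. The same delay-element idea underlies time-mode filters and the phase detectors used in PLLs.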

Speaker biography: Gordon W. Roberts is a full professor at McGill University in Montreal and holds the James McGill Chair in Electrical and Computer Engineering. He has co-written five textbooks related to analog IC design and mixed-signal test. Roberts has supervised over 40 graduate students at the master’s and PhD levels. In 2003 he took leave from McGill to co-found DFT Microsystems, Inc., a company specializing in high-speed timing measurement. He returned to McGill in 2005 where his current research includes analog IC design methods and built-in self-test techniques for analog and high-speed digital circuits. Roberts is a Fellow of the IEEE.

*Gordon W. Roberts photo courtesy of McGill University

Edward A. Lee


March 11


Edward A. Lee, Robert S. Pepper Distinguished Professor, Department of Electrical Engineering and Computer Science, University of California, Berkeley

Abstract: This talk argues that cyber-physical systems present a substantial intellectual challenge that requires changes in both theories of computation and dynamical systems theory. The CPS problem is not the union of cyber and physical problems, but rather their intersection, and as such it demands models that embrace both. Two complementary approaches are identified: cyberizing the physical (CtP) means to endow physical subsystems with cyber-like abstractions and interfaces; and physicalizing the cyber (PtC) means to endow software and network components with abstractions and interfaces that represent their dynamics in time.

Speaker biography: Edward A. Lee is the Robert S. Pepper Distinguished Professor and former chair of the Electrical Engineering and Computer Sciences (EECS) department at the University of California, Berkeley. His research interests center on design, modeling, and simulation of embedded, real-time computational systems. He is a director of Chess, the Berkeley Center for Hybrid and Embedded Software Systems, and is the director of the Berkeley Ptolemy project. Lee also is coauthor of five books and numerous papers. He has led the development of several influential open-source software packages, notably Ptolemy and its various spinoffs. He received his bachelor’s degree from Yale University (1979), master’s degree from MIT (1981), and PhD from University of California, Berkeley (1986). From 1979 to 1982 he was a member of technical staff at Bell Telephone Laboratories in Holmdel, New Jersey, in the Advanced Data Communications Laboratory. He is a cofounder of BDTI, Inc., where he is currently a Senior Technical Advisor, and has consulted for a number of other companies. He is a fellow of the IEEE, was an NSF Presidential Young Investigator, and won the 1997 Frederick Emmons Terman Award for Engineering Education.

Leroy Hood


April 5


Leroy Hood, President and Co-founder, Institute for Systems Biology

Abstract: The Human Genome Project has catalyzed fundamental changes in the practice of biology and medicine. One of these has been to generate a genetics parts list that includes all human genes (and by inference, all human proteins). Analysis of the information from the Human Genome Project has also catalyzed the view that “biology is an informational science.” Together, these advances have promoted the idea of systems biology—the view that biology can only be understood through an analysis of the information processing of biological machines and networks.

The challenge for biology in the 21st century is to deal with its incredible complexity. The informational view of biology leads to the conclusion that biological information is captured, mined, and integrated by biological networks, which ultimately pass it off to molecular machines that execute biological functions. Hence the challenge in understanding biological complexity is that of deciphering the operation of dynamic biological networks across the three time scales of life: evolution, development, and physiological responses. Systems approaches to biology focus on delineating and deciphering these dynamic networks; systems approaches to disease delineate disease-perturbed networks and seek to understand how they encode the pathophysiology.

I will outline the contemporary state of systems biology and then focus on its application to disease. In particular, I will discuss in detail a model system we have studied, prion disease in mice. This systems approach provides a powerful new way to understand disease mechanisms and suggests new strategies for diagnosis and therapy. I will also discuss in some detail our systems approach to blood diagnostics.

I will then turn to a series of emerging technologies that will transform the landscape of medicine: next-generation DNA sequencing, new approaches to protein analysis, single-cell analyses, and powerful new applications of molecular imaging techniques. The systems view of disease has catalyzed revolutionary changes in how to think about diagnosis, therapy, and even prevention.

Fueled by dramatic changes in in vitro and in vivo measurement technologies, systems medicine is pushing toward a revolution in health care—one that over the next five to 20 years will lead away from the current reactive medicine (waiting until one gets sick before treating) toward a medicine that is predictive, preventive, personalized, and participatory (P4). Increasingly, the focus will be on wellness rather than disease. P4 medicine will require billions of measurements on each patient and the means of reducing these measurements to coherent hypotheses about the health and disease of individual patients. How the acquisition, storage, mining, integration (of different data types), modeling, and eventual distribution of the data and the resulting inferences will occur is one of the grand challenges of this future in health care. Security and access will be critical considerations.

This P4 medicine will catalyze fundamental changes in virtually every aspect of the health care system, and it will require rethinking the educational requirements for physicians. It will lead to the digitalization of medicine, with changes even more profound than the digitalization of information technologies and communications. It also will lead to a turnaround in the inexorably increasing health care costs—with the possibility of bringing developed world medicine to the developing world. Medicine will truly become an informational discipline—with the enormous potential for individuals to take an active role in helping to guide their future health choices. The digitalization of medicine will transform the computational requirements of health care in ways that we can only begin to imagine.

Speaker biography: Leroy Hood’s research focuses on fundamental biology (immunity, evolution, genomics) and on bringing engineering to biology through the development of five instruments: the DNA and protein sequencers and synthesizers and the ink-jet oligonucleotide synthesizer (for making DNA arrays), for deciphering the various types of biological information (DNA, RNA, proteins, and systems). In particular, the DNA sequencer has revolutionized genomics by allowing the rapid automated sequencing of DNA, which played a crucial role in the successful mapping of the human genome during the 1990s and early 2000s. These instruments constitute the technological foundation for modern molecular biology and genomics. He has applied these technologies to diverse fields including immunology, neurobiology, cancer biology, molecular evolution, and systems medicine. Early in his career, he applied these technologies to the study of molecular immunology (and discovered many of the fundamental mechanisms for antibody diversity) and neurobiology (he cured the first neurological disease by gene transfer in mice).

Hood is now pioneering the idea that the systems approach to disease, the emerging technologies, and powerful new computational and mathematical tools will move medicine from its current reactive mode to a predictive, preventive, personalized, and participatory mode over the next five to 20 years.

He has received 17 honorary degrees from institutions such as Johns Hopkins University, Yale University, UCLA, and Whitman College. He has published more than 680 peer-reviewed papers, received 26 patents, has coauthored textbooks in biochemistry, immunology, molecular biology, and genetics, and is just finishing a textbook on systems biology. In addition, he coauthored with Daniel Kevles a popular book on the Human Genome Project, The Code of Codes.

Hood is one of only seven (of more than 6,000 members) scientists elected to all three academies: the National Academy of Sciences, National Academy of Engineering, and Institute of Medicine.  He also has played a role in founding more than 14 biotechnology companies, including Amgen, Applied Biosystems, Systemix, Darwin, and Rosetta. He is currently pioneering systems medicine and the systems approach to disease and has recently cofounded the company Integrated Diagnostics—that hopefully will become a platform company for P4 medicine.

Hood has a PhD from the California Institute of Technology and MD from Johns Hopkins School of Medicine.

*Lecture cosponsored by Committee on Lectures (funded by GSB)