Dear Computer Science Alumni and Friends,
The Stanford Department of Computer Science has had another wonderful year. The department continues to be ranked at the top of all computer science departments nationally and internationally. Our faculty are deeply involved in teaching and conduct research on a broad range of topics at the forefront of computing. We continue to attract the best and brightest students from around the world.
This newsletter covers the highlights of the 2007-2008 academic year: the arrival of new faculty, the promotions of existing faculty, a brief glimpse at some of our exciting research activities, and the celebration of awards and honors bestowed upon department members.
This has been an outstanding year for faculty recruiting, and I am pleased to welcome three new faculty members: John Ousterhout, Jeffrey Heer, and Jure Leskovec.
John Ousterhout is returning to the academic world after 14 years in industry at Sun Microsystems, Scriptics, Interwoven, and Electric Cloud. He joins the department as a professor (research). John is one of the top computer systems researchers worldwide and is known for building elegant and influential systems, including operating systems (Medusa and Sprite), CAD tools (Magic and Crystal), and scripting languages (Tcl/Tk). These projects created communities that are still active today. John is a member of the National Academy of Engineering and recipient of the 1987 Grace Murray Hopper Award of the Association for Computing Machinery. He was on the faculty at the University of California, Berkeley, from 1980 to 1994.
Jeff Heer, from UC Berkeley, joins the department as an assistant professor. Jeff conducts research in human-computer interaction, with an emphasis on interactive visualization and social computing. He built sense.us, a web application for analysis of 150 years of U.S. Census data that allows users to work as a group in analyzing the data by commenting on and annotating visualizations. Jeff is also known for creating prefuse, an open-source toolkit for authoring information applications. It has been downloaded over 40,000 times and is widely used in industry and academia and by hobbyists.
Jure Leskovec, from Carnegie Mellon University, also joins the department as an assistant professor. Jure conducts research in applied machine learning and large-scale data mining, focusing on analysis and modeling of large real-world networks (e.g., the Web and social and technological networks). Studying how networks evolve over time, he found data that overturned the conventional view of network evolution. He also has studied the dynamics of network behavior, creating a model of how a person's decisions are affected by the quality, frequency, and timing of recommendations from others. Jure has made significant contributions to the development of novel algorithms for networks. He collaborated with several CMU faculty on a project on sensor placement for outbreak detection. This work provided algorithms with provable guarantees for placing sensors in networks, and the algorithms work in different types of environments.
The department had two faculty promotions this year. Dan Boneh was promoted to the rank of full professor, and Serafim Batzoglou was promoted to the rank of associate professor with tenure.
Dan Boneh has made numerous contributions to the field of cryptography and computer security. His career highlights include black-box algorithms (a computational abstraction that showed deep relevance to the security of cryptographic protocols, mainly Diffie-Hellman); collusion-secure fingerprinting, which uses coding theory to achieve resistance against a collusion of users having different versions of the same fingerprinted document; and identity-based encryption using bilinear maps. The bilinear-map approach has revolutionized cryptography by introducing the first practical solution to the problem of identity-based encryption; that is, using your e-mail address as your public key. Dan's more recent focus has gradually shifted to systems security, a very relevant problem in the Internet age. He has conducted research on improved implementation of network security mechanisms (SSL), anti-phishing tools (SpoofGuard and PwdHash), voting systems, digital rights management, and password protection.
Serafim Batzoglou conducts research in computational genomics. The broad goal of his research is to develop efficient and accurate methodologies for the analysis of genomic data. He has made significant progress on important problems in five areas: whole genome assembly, sequence alignment, network alignment, RNA secondary structure prediction, and gene structure prediction. His software tools are considered "brand names" and are widely used in the field.
A tremendous amount of exciting research is going on in the department. Space permits me to describe only a few of the current projects. To read more about CS research at Stanford, please visit the department website at www-cs.stanford.edu.
In May Stanford announced the formation of the Pervasive Parallelism Lab (PPL). The PPL is addressing the critical problem of programming multicore processors. Parallel programming is an old problem that has eluded a general solution for the last four decades. However, the rise of multicore processors has made the solution to this problem critical to the continued scaling of computing performance and cost that has fueled the information revolution. The PPL is a collaboration between Stanford faculty in Computer Science (Alex Aiken, Bill Dally, Ron Fedkiw, Pat Hanrahan, John Hennessy, Mark Horowitz, Vladlen Koltun, Christos Kozyrakis, Kunle Olukotun, Mendel Rosenblum, and Sebastian Thrun) and partner companies in computer systems (Sun Microsystems, Advanced Micro Devices, NVIDIA, IBM, Hewlett-Packard, and Intel). The partner companies are providing $6 million over three years to fund the new lab.
The goal of the PPL is to make parallel application development possible without the need for explicit parallel programming. The key ideas that are being explored to achieve this goal are domain-specific languages, a common parallel runtime, and new hardware primitives for parallelism. Driving parallel applications such as virtual worlds and robotic automobiles will focus the PPL research by exposing real problems of parallel programming and execution. Domain-specific languages are very high-level languages that are tailored to solving specific kinds of problems. They enable the average programmer to quickly write efficient parallel applications and offer greater parallel optimization opportunities. The common parallel runtime will provide the best combination of static (compile time) and dynamic (runtime) management of the mapping between the DSL programs and the parallel hardware. The new parallel hardware primitives will reduce the overhead of communication, synchronization, and isolation to provide high performance, low power, resilience, and security.
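The domain-specific-language idea above can be illustrated with a toy sketch: the programmer expresses element-wise operations declaratively, and a runtime decides how to parallelize them. This is purely illustrative of the concept (the `ParallelArray` class and its thread-pool runtime are my own invented stand-ins, not the PPL's design):

```python
from concurrent.futures import ThreadPoolExecutor

class ParallelArray:
    """Toy embedded-DSL sketch: the user writes declarative map/reduce
    operations; a runtime (here, a thread pool) handles parallel execution,
    so no explicit thread or lock code appears in user programs."""

    def __init__(self, data):
        self.data = list(data)

    def map(self, fn):
        # The runtime chooses how to schedule the work across workers.
        with ThreadPoolExecutor() as pool:
            return ParallelArray(pool.map(fn, self.data))

    def reduce(self, fn, initial):
        acc = initial
        for x in self.data:
            acc = fn(acc, x)
        return acc

# User code: no threads, locks, or synchronization are visible.
squares = ParallelArray(range(5)).map(lambda x: x * x)
total = squares.reduce(lambda a, b: a + b, 0)
```

The point of the sketch is the division of labor: the user's program states *what* to compute, while the runtime owns the mapping onto parallel hardware, which is where the PPL's static and dynamic optimization would apply.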
Rajeev Motwani's RAIN (Research on Algorithms for the Internet) project is laying the foundations of the emerging area of Internet algorithmics. This project involves the development of algorithmic techniques for solving emerging problems in the area of web/social/collaborative computing. This includes applications such as web search, online advertising, web mining, social networks, reputation systems, recommendation systems, and collaborative filtering, as well as data privacy issues in this context. Of particular interest is the solution of hard computational and data analysis problems involving massive data sets using efficient algorithms that run in linear or even sublinear time (typically, via randomized techniques). Early work in this area by Rajeev and his collaborators contributed to the development of techniques such as anchor-text analysis and the PageRank algorithm, some of the key technology components for Google.
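The PageRank algorithm mentioned above can be sketched as a power iteration over the link graph. The version below is a minimal illustration; the example graph, damping factor, and iteration count are illustrative assumptions, not Google's production values:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal PageRank sketch by power iteration.
    links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        # Every page receives a base (1 - damping) share ...
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # ... plus an equal share of each in-neighbor's rank.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Tiny illustrative link graph: A -> B, C; B -> C; C -> A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

Each iteration conserves total rank, so the values remain a probability distribution over pages; pages with more (and better-ranked) incoming links end up with higher scores.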
In recent years, Rajeev and his students have had a particular interest in measuring and modeling the Web. Their recent work has provided algorithms for measuring the size of the (indexable) Web and created a model for the evolution of the structure of the Web. Another strong interest is in the auction process underlying the pricing of online advertising. They devised an auction methodology for pricing advertisement slots on a web page that is guaranteed to be truthful, removing any incentive for advertisers to underbid (or overbid). They proved formally that this new mechanism is revenue maximizing and a common generalization of the schemes currently being used by Google and Yahoo. Another recent accomplishment is the development of fast, simple, and spam-resistant recommendation systems. They also have helped lay out the formal foundations of the emerging area of data privacy, including some recent initial work on privacy in social networks. Ongoing work includes novel mechanisms for advertising and marketing within the context of social networks, including the modeling and analysis of the viral diffusion of ideas within such networks.
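The truthfulness property described above is easiest to see in the single-item second-price (Vickrey) auction, a simpler relative of the slot-pricing mechanisms the group studies; this sketch is illustrative only, and the bidder names and values are hypothetical:

```python
def second_price_auction(bids):
    """Single-item second-price auction sketch: the highest bidder wins
    but pays only the second-highest bid. Because the price paid does not
    depend on the winner's own bid, bidding one's true value is a
    dominant strategy -- the truthfulness property discussed above."""
    ordered = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ordered[0][0]
    price = ordered[1][1] if len(ordered) > 1 else 0.0
    return winner, price

# Hypothetical advertisers bidding for one slot.
winner, price = second_price_auction({"ad1": 3.0, "ad2": 5.0, "ad3": 4.0})
```

Here raising or lowering your bid never changes what you pay if you still win, only whether you win, which is what removes the incentive to underbid or overbid.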
Last year the department initiated a seed funding program to advance the department's strategic plan by providing start-up funds for faculty research in our strategic focus areas that are unlikely to be funded by conventional sources. In this second year of the program the department seed-funded four exciting new efforts.
Christos Kozyrakis, Philip Levis, and Nick McKeown are deploying a large-scale power measurement network in Gates Hall. In addition to documenting and monitoring energy consumption of end-user computer systems, servers, and clusters, the network will measure the computing network infrastructure's routers and switches. The long-term goals of the network are to understand how a complex, medium-scale computing network spends its energy, determine how these expenditures are tied to use, provide a testbed for research into energy-efficient computing, and help the department reduce its energy use. The network will publicly provide real-time data on the Gates building's current and historical energy consumption through a web interface, generating useful data sets for the larger research community.
In addition to using commodity, off-the-shelf power meters, the project is designing and building a wireless power measurement sensor. Besides measuring current draw, the sensor can monitor environmental conditions relevant to network administrators, such as temperature, humidity, and dust. Developing a new sensing platform gives the project a great deal of data collection flexibility over commodity approaches. For example, the wireless sensor will be able to measure current draw traces at up to 14kHz, while commodity parts typically provide only 1Hz. Also, by having the sensors self-organize into an ad-hoc data delivery mesh network, deploying new instrumentation points throughout the building will be much easier than with wired sensors.
Andrew Ng and Kunle Olukotun have started a project to develop brain-like computer architectures. The brain is the world's most capable processor, but modern-day computers are ill suited to perform brain-like computations on a large scale; this presents a significant barrier to advances in artificial intelligence, where small-scale brain simulations have already proved very useful for machine-learning applications, such as in computer vision. However, it is currently infeasible to scale these algorithms or to experiment with large-scale brain models.
Motivated by the thesis that much of the human brain (neocortex) may be implementing a single learning algorithm, Ng and Olukotun are developing computer hardware with brain-like organizations, in which numerous lightweight but programmable computational units (CFneurons) are massively connected to 10⁴ to 10⁵ other computational units. They plan to use this hardware to develop improved brain-like machine-learning algorithms and better neuroscience models of the brain. If one can elucidate or find some approximation to the brain's learning algorithm and apply it on a large scale, significant progress toward human-level artificial intelligence may then be possible. By allowing researchers to experiment with large-scale brain models, Ng and Olukotun hope to enable progress toward this long-term goal.
Social websites such as Wikipedia, Facebook, del.icio.us, Flickr, MySpace, and Orkut have gone from being a small niche of the Web to one of its most important components. In these sites, a community of users contribute their resources, which can be photos, personal information, evaluations, votes, answers to questions, or annotations on web pages.
To study a number of research questions about such sites, Hector Garcia-Molina and Scott Klemmer have started a new project on social networks and web usability analytics. The project is based on CourseRank, a course evaluation and recommendation system for the Stanford University community, built by students in our InfoLab (Filip Kaliszan, Henry Liou, Benjamin Bercovitz, Robert Ikeda, Mike Krieger, and Georgia Koutrika). As of May 2008, CourseRank has over 6,200 users and over 134,000 course evaluations. CourseRank both provides a useful service to the Stanford community and serves as a laboratory for research on social networks.
Daphne Koller and her collaborator at the Stanford School of Medicine, neonatologist Anna Penn (Department of Pediatrics-Neonatology), are planning to analyze data obtained from online monitoring of vital signs of extremely premature infants in a neonatal intensive care unit at the Lucile Packard Children's Hospital. These tiny infants are born 8 to 17 weeks prematurely, and weigh between 400 and 1500 grams; they are susceptible to a wide range of complications that can induce both short-term catastrophic events and long-term adverse outcomes.
Currently, the care of these infants is monitored by an array of devices that measure dozens of physical attributes: blood pressure, heart rate, and more. Physicians, nurses, and other health care providers monitor these data on a moment-by-moment basis to watch for catastrophic events and look at general trends that indicate adverse outcomes. However, even these experts find it difficult to recognize complex patterns across all of the available measurements, which are high-dimensional, very noisy, and change rapidly. Koller and Penn aim to develop new computational modeling and machine-learning algorithms that can automatically monitor and analyze the streams of data measured for patients in an intensive care unit, with the goal of combining these noisy low-level signals for early detection, possibly allowing improved patient outcomes by early application of standard therapies. An even more tantalizing hope is that their method will enable them to recognize previously unidentified causative relationships and design new interventions.
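The flavor of automated monitoring over noisy vital-sign streams can be illustrated, in greatly simplified form, with a rolling-baseline anomaly detector. This is a toy stand-in for the statistical models Koller and Penn are developing, not their method; the window size, threshold, and the synthetic signal are all illustrative assumptions:

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(stream, window=20, threshold=3.0):
    """Flag readings more than `threshold` standard deviations away from
    a rolling baseline of the last `window` readings. A deliberately
    simple sketch of stream monitoring; real ICU models combine many
    noisy signals rather than thresholding one."""
    history = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(stream):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                alerts.append((i, value))
        history.append(value)
    return alerts

# Synthetic, heart-rate-like signal with one sudden spike at index 30.
signal = ([120 + (i % 3) for i in range(30)]
          + [180]
          + [120 + (i % 3) for i in range(20)])
alerts = detect_anomalies(signal)
```

A single-signal threshold like this is exactly what clinicians already do by eye; the research challenge described above is learning patterns *across* dozens of such noisy streams at once.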
This past year we completed the most sweeping changes to our undergraduate Computer Science program since the major was created in the mid-1980s. The Curriculum Committee, including Jerry Cain, Bill Dally, Vladlen Koltun, Phil Levis, John Mitchell, Andrew Ng, Nick Parlante, Eric Roberts, Mendel Rosenblum, Mehran Sahami (Chair), Claire Stager, and Julie Zelenski, led the effort to reinvigorate the curriculum by highlighting the variety of areas that Computer Science now encompasses. The results of this effort, the culmination of over a year's work that directly involved the vast majority of the faculty in the department, will go into effect in the coming academic year. The new program combines a solid set of core courses covering foundational concepts in computing (including programming and systems development as well as theoretical computer science and mathematics) with a set of "tracks" (concentration areas) from which students may choose. The initial set of tracks includes artificial intelligence, computer systems, theoretical computer science, human-computer interaction, graphics, information systems, and biocomputation. Additionally, students have the option to select an "unspecialized" track that replaces depth with further breadth, or to design their own area of concentration. We anticipate adding tracks on an ongoing basis. The tracks have been designed both to give students the ability to pursue an area of interest in more depth and to provide a number of multi-disciplinary course options reflecting the importance of computing in a wide variety of domains.
Awards and Honors
The team of Sebastian Thrun and Mike Montemerlo took second place in the latest DARPA Grand Challenge. This third competition, the "Urban Challenge," took place on November 3, 2007, at the site of the now-closed George Air Force Base (currently used as Southern California Logistics Airport) in Victorville, California. The fully autonomous vehicles raced to complete a 60-mile urban-area course in less than 6 hours. Rules required obeying all traffic regulations while negotiating other traffic and obstacles and merging into traffic. The Stanford team's 2006 Volkswagen Passat completed the course in just under 4-1/2 hours and averaged 13.7 mph.
Russ Altman has been named as a Fellow of the American Institute for Medical and Biological Engineering (AIMBE). AIMBE Fellows are recognized around the world for their writings (including both books and contributions to technical and academic journals) and their presentations at academic and industry conferences.
Gil Bejerano has been dually honored with a Sloan Research Fellowship from the Alfred P. Sloan Foundation and as a Searle Scholar. The Sloan Foundation awards 118 fellowships annually to enhance the careers of the very best young faculty members in specified fields of science. The Searle Scholars Program awards grants to selected universities and research centers to support the independent research of outstanding individuals in the biomedical sciences and chemistry.
Ron Fedkiw, along with two collaborators at Industrial Light & Magic, received a Scientific and Technical Academy Award in February from the Academy of Motion Picture Arts and Sciences for work on fluid simulation. This well-deserved award recognizes Ron's leadership in developing the methods used to simulate fluids and smoke in many feature films.
Ed Feigenbaum, Monica Lam, Marc Levoy, Rajeev Motwani, and Eric Roberts have been named as Fellows of the Association for Computing Machinery in recognition of their achievements in computer science and information technology and for their significant contributions to the mission of the ACM. Among the 38 inductees for 2007, they were honored as follows:
Ed Feigenbaum: For contributions to artificial intelligence
Monica Lam: For contributions to compilers and program analysis
Marc Levoy: For contributions to computer graphics
Rajeev Motwani: For contributions to algorithms and complexity theory
Eric Roberts: For contributions to computer science education
Leo Guibas received the ACM/AAAI Allen Newell Award for pioneering work in computational geometry, with profound applications across an astonishingly broad range of computer science disciplines. It is presented to an individual selected for career contributions that have breadth within computer science, or that bridge computer science and other disciplines.
Mark Horowitz has been elected a Fellow of the American Academy of Arts and Sciences. The Academy is an independent policy research center that undertakes studies of complex and emerging problems. Current Academy research focuses on science and global security, social policy, the humanities and culture, and education.
Scott Klemmer has received a Sloan Research Fellowship. The Sloan awards are intended to enhance the careers of the very best young faculty members in specified fields of science. Currently a total of 118 fellowships are awarded annually in seven fields: chemistry, computational and evolutionary molecular biology, computer science, economics, mathematics, neuroscience, and physics.
Daphne Koller was awarded the ACM-Infosys Foundation Award in the Computing Sciences for her work on combining relational logic and probability, which allows probabilistic reasoning to be applied to a wide range of applications, including robotics, economics, and biology. This award recognizes personal contributions by young scientists and system developers to a contemporary innovation that, through its depth, fundamental impact, and broad implications, exemplifies the greatest achievements in the discipline.
Phil Levis has been named a Microsoft Research New Faculty Fellow, one of five up-and-coming faculty members chosen from universities throughout North America. The fellowship program identifies, recognizes, and supports exceptional new faculty members engaged in innovative computing research. Phil researches software and networking for tiny, low-power, wireless sensors. He focuses on making these networks of sensors easier to deploy and maintain by researching ultra-simple algorithms that use robust local rules to achieve desirable global behaviors. Software he develops is used by hundreds of research groups worldwide and runs on millions of nodes.
Subhasish Mitra has received the ACM SIGDA (Special Interest Group on Design Automation) Outstanding New Faculty Award, which recognizes a junior faculty member early in her or his academic career who demonstrates outstanding potential as an educator and/or researcher in the field of electronic design automation.
Kunle Olukotun has been named an IEEE Fellow for contributions to multiprocessors on a chip and multi-threaded processor design. Fellowship recognizes unusual distinction in the profession with an extraordinary record of accomplishments in any of the IEEE fields of interest. The accomplishments have contributed importantly to the advancement or application of engineering, science, and technology, where they have brought significant value to society.
Yoav Shoham received the ACM SIGART (Special Interest Group on Artificial Intelligence) Autonomous Agents Research Award. This award acknowledges the contributions of outstanding researchers in the field of autonomous agents and is granted each year to one individual whose work is influencing and setting the direction for the field.
Chuan Sheng "CS" Foo has received a 2008 Computing Research Association Outstanding Undergraduate Award. The organization's awards program recognizes undergraduate students in North American universities who show outstanding research potential in an area of computing research. Loren Yu received an Honorable Mention for this award.
Lawson Wong was this year's winner of the Wegbreit Prize for Best Honors Thesis in the Department of Computer Science and also a winner of a University Firestone Medal for Excellence in Undergraduate Research. His thesis, "Robotic Grasping on the Stanford Artificial Intelligence Robot," develops a probabilistic model that helps a robot arm to grasp new objects with greater stability. His honors thesis advisor is Andrew Ng.
Under the direction of Jerry Cain, Stanford once again hosted the ACM ICPC Regional Programming Contest for Northern California and Nevada. Stanford's teams placed 3rd, 7th, and 9th out of 77 teams in the regional contest. Stanford's top-ranking team, comprising Chen Gu, Andy Nguyen, and Meng-Hsuan Wu, advanced to the World Finals in Banff, Canada. At the World Finals, the team placed 7th out of 100 teams (placing 2nd among teams from North America). The team solved 7 out of 11 problems at the competition, and was on the verge of solving an 8th problem, which would have matched the performance of the winning team. We are proud of all their work.
With best regards,
William J. Dally
William R. and Inez Kerr Bell Professor
Chair, Department of Computer Science