Natalie Ahn, Julie Black, Jonathan Effrat

As the processor speed of personal computers increases (a steady trend that “Moore’s law” predicts for computing technology), a growing number of CPU cycles are wasted as personal computers sit idle, waiting for users to request that a task be executed.  Distributed computing is the practice by which a host computer or system puts these otherwise wasted CPU cycles to use.  Software that executes certain discrete tasks is sent via a network (local or wide area), such as the Internet, to client personal computers.  Once installed on the client computers, this software allows the host computer to send packets of information to individual PCs to be processed; the results are returned to the host server, tabulated, and compared to results computed by other clients.  Using this method (explained in more detail in What Is It?), large data sets can be processed and analyzed in a reasonable amount of time despite requiring an extremely large number of CPU cycles.
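To make the distribute-and-collect pattern above concrete, here is a minimal sketch in Python.  It is illustrative only and is not the protocol of any actual distributed computing project: local worker processes stand in for remote client PCs, and process_work_unit is a placeholder for whatever discrete task a real project would install on its clients.

    # Minimal sketch of the distribute-and-collect pattern described above.
    # Real projects ship work units over a network to volunteer PCs; here,
    # local worker processes stand in for remote clients purely for illustration.
    from multiprocessing import Pool

    def process_work_unit(unit):
        # Placeholder task: in a real project this would be the discrete
        # computation (e.g., analyzing one slice of a large data set).
        return sum(x * x for x in unit)

    if __name__ == "__main__":
        # The "host" splits a large data set into small packets (work units).
        work_units = [list(range(i, i + 1000)) for i in range(0, 1_000_000, 1000)]

        # Each unit is handed to an idle worker; results come back to the host,
        # where they are tabulated (and, in practice, cross-checked against
        # results returned by other clients).
        with Pool() as pool:
            results = pool.map(process_work_unit, work_units)

        print("Tabulated result:", sum(results))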


Purpose of this web site:

This website was developed as a research project for a class offered through Stanford University’s Freshman and Sophomore Programs office as part of Sophomore College.  The class, The Intellectual Excitement of Computer Science, was team-taught by Professor Eric Roberts of the Stanford University Computer Science Department and John Hennessy, the current university president and a professor of Electrical Engineering.
