An Advanced Course on Computer Networks

In Spring 1991, a new advanced course (CS 740) was introduced. The prerequisite for this course is the introductory course described above. In the future, a course covering performance analysis and simulation may also become a prerequisite, which would make it possible to discuss a wider range of papers.

During the semester, each student gave two lectures on papers chosen in consultation with the instructor. In most cases, students read additional papers to broaden the scope of the material discussed or to add perspective to it. Students used transparencies to support their presentations, and all students in the class received copies of these transparencies at the time of the lecture. Written guidelines on technical lectures and on transparency form and content were given to the students at the beginning of the semester. In addition, the instructor commented on each student's transparencies before the lecture.

Each student was required to "review" half of the papers presented. This involved completing and submitting a standard conference review form (as used by program committees). As part of each review, students had to comment on strong and weak points, the major research contribution, and suggestions for follow-on work. In addition, they had to give a recommendation for acceptance, acceptance with changes, or rejection. At the end of each presentation, students submitted their reviews. One student was then chosen to comment on the paper; the others were encouraged to comment on the paper or on the remarks of their colleagues. Over the course of the semester, there were a number of lively discussions on the merits of various papers.

Requiring written reviews, and then expecting those who wrote them to serve, in effect, as rapporteurs, is very important to the success of the course. Informal seminars often suffer from the fact that many attendees do not read the papers being presented. The reviews ensured that this would not happen and that each student would read and think carefully about at least half of the papers presented.

It was interesting to observe how student perspectives matured over the course of the semester. At the beginning, almost all recommendations were for acceptance. As the semester proceeded, students began to understand that publication does not automatically imply perfection, or even high quality, and that not all published papers deserved to be accepted; they became more willing to criticize. Inadequate simulation experiments and poorly explained results were common targets of criticism. In a number of instances, students (correctly) questioned the validity of the results. They were also surprised by the (unfortunate) propensity of some authors to publish the same results multiple times. For most students, this opportunity to critique published papers was a new experience, and most thoroughly enjoyed it.