JB: Welcome to the CREN Virtual Seminar Expert Series for Spring of 1998 on Campus Communications Strategies. Whether you are joining us by phone or on the Internet, you are here because it's time. This is Judith Boettcher of CREN, one of your co-hosts for today's session. And today's other co-host is Greg Marks of MERIT. Thanks for being here today, Greg.
GM: Well, thank you. It's good to be back and it's nice to have a couple of other folks on the line with us.
JB: Okay, great. Our guest expert today is Mark Bruhn from Indiana University. Mark is well-versed in network security. He's currently in charge of technology policy development and education in the Office of the VP for Information Technology at Indiana, and also a member of the Security Office. Welcome, and thanks for being here, Mark.
MB: Glad to be back, Judith, and this time, I've brought along a real expert.
JB: Well, good! And do you want to introduce him, or shall I go ahead?
MB: No, go ahead.
JB: Okay. Welcome also to Brent Sweeney, a lead network engineer at Indiana, who works closely with Mark and as Mark just said, is obviously a real expert. Brent works in the Network Operations Group and worries about network topology and the various components of that topology. Thanks for being here, Brent.
BS: Thanks for inviting me.
JB: Okay, Greg, do you want to review how we do these things?
GM: Yes. So that folks are clear, we have two options. One is that you can dial in by phone, I should say, and join this call live. When your phone call connects, you'll be right here on the air with the rest of us. That number is 734-763-1533. Or you can join via the Internet at www.cren.net. Either way, to ask questions, you have two options. Again, you can call us and ask the question, or from the CREN site, the web page, there's a spot that you can click that will send e-mail. Or you can simply send e-mail to ccs@cren.net, and we'll be happy to answer your questions while we're here during this session.
JB: Thanks very much, Greg. And this is our second series of Campus Communications Strategies Tech Talk, and remember that if you do miss a session, you can replay the sessions from the net at your convenience. Let's go ahead and get started on our topic today. We find that we certainly have enough questions and things to talk about for many events on network security. This is a topic of growing interest on university campuses and generally. We are learning to live with some of the risks, of course, of doing business with students over the net, but knowing just what risks to take and how to minimize those risks while maximizing the convenience of the web is really tough. Mark, any initial thoughts here? Are you doing business with students over the web at Indiana right now?
MB: Absolutely. First off, I think we all understand that access to the Internet by families from, you know, households has exploded, and the number of Internet service providers has dramatically increased over the last couple of years. What this means is that prospective students and their parents don't have to wait for written material or on-site tours of our schools any more. They can and they want to visit campus Websites, see what's available to them there, at the various universities before they even leave home. A school's Internet presence, in fact, will probably be the first impression that a family gets of a particular school. In addition, you know, they want to be able to do things from home like submit applications and other materials. Some of the offerings are certainly different for our different campuses, but as I look at our Bloomington office of admissions page, for example, I see there's academic information there for prospective students, you know, first-time college students, information for transfer students, information for graduate students, information for international students. We have a lot of international students here at IU certainly. About IU, about Bloomington, there's an application for admissions online there at that site, and there's a virtual campus tour that students and parents can take. Now, once they get here, they're going to see that we have most of the dorms wired. They can do things from their rooms where before, they may have had to fight crowds, you know, at things like registration or course exchange. In looking at our online student services site, which is called InSite, and that stands for something, but I don't remember right offhand what it is, I see things like advising. They can get advising reports. And what's really neat there is they can get online simulation of what adding courses would do, what different grades that they attain in certain courses would do, and how different hours might affect their schedule. They can get a bursar statement there, up to the minute account status. They can change their current or permanent address. They can see course offerings, up to the minute financial aid awards, tracking of--and this is kind of neat--financial aid applications, because you know how many steps that thing has to go through. And they can get a record of their total educational borrowing to date. They can also get grade reports, transcript information, and they have a very nice way for the student to view their semester schedule in various formats. That way, I think we talked about last time, if they don't want to take a Friday morning 8:00 class, they can adjust and do that visually. Students can also register at IU Bloomington and a couple of other campuses by touch-tone telephone as well. Obviously, other schools are doing the same kind of stuff, and some certainly more aggressive than we are. But we have to keep up, because obviously, we're still competing for that same tuition dollar.
BS: One of the other interesting things about it is the students can see the kinds of support they'll have once they come. If you're really interested in coming to a place, you want to see what it's going to be like after you get there, too, and you can see a lot of the same kinds of support things on line.
GM: You talked about a lot of things there that students would be concerned about either the privacy or confidentiality of the information or its integrity, that someone else can't mess with it. And so your comments provide a whole lot of lead-ins to the kinds of things we're going to talk about this afternoon.
MB: And in fact it shows that we've at least instilled a certain amount of paranoia in you, Greg, to come up with that question. That's a good thing.
JB: Just going back to all of those initial interactions of students that you both mentioned, are there any concerns about whether the students who are actually applying are, in fact, real students, or whether their transcripts or other supporting materials have integrity in their content?
MB: Well, now, what they're doing there actually is they're applying for an admission packet, so what they're going to get then in the mail is information about what they need to provide. And as I recall, they don't accept the admission fee via that page. So it's really an application for an application, if I understand it. I haven't spent a lot of time looking at that, but that's as I understand it. Now, from an identification and authentication standpoint, most of the things that a student can get to while they're here, that is, after they're registered, is accomplished with a student PIN number. And this is fairly common now for schools that provide this kind of a service. At registration, the Registrar's office assigns them a PIN number, and that's on their registration ticket. They're told what that's for, and they change it, and then subsequently, they provide their student identification number, their student ID and that PIN number on these other services.
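As a rough illustration of the ID-plus-PIN check Mark describes, here is a minimal sketch in Python. The hashing scheme, record layout, and sample values are assumptions made for illustration, not IU's actual implementation.

```python
import hashlib
import hmac
import os

# Hypothetical store of (salt, PIN hash) keyed by student ID.
_pin_store: dict[str, tuple[bytes, bytes]] = {}

def set_pin(student_id: str, pin: str) -> None:
    """Record a salted hash of the PIN the Registrar assigns (or the student later changes)."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
    _pin_store[student_id] = (salt, digest)

def check_pin(student_id: str, pin: str) -> bool:
    """Return True only if the supplied student ID and PIN match the stored record."""
    record = _pin_store.get(student_id)
    if record is None:
        return False
    salt, expected = record
    candidate = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, expected)

# Example: the Registrar assigns an initial PIN; the student later presents ID + PIN to a service.
set_pin("0001234567", "8321")
print(check_pin("0001234567", "8321"))  # True
print(check_pin("0001234567", "0000"))  # False
```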
JB: Okay. You know, you've managed to mention the fact that there's a real demarcation, I think, between before the students actually get on campus and once they are actually on campus. So in terms of dealing with network security, then, maybe you want to address how you handle both that internal security and perhaps the external security. Or how, in fact, are those areas demarcated? Do you use firewalls, for example?
MB: You know, there's still a lot of discussion about firewalling. Obviously, vendors of firewall hardware and software will tell you that you need several large ones, but two issues are really involved. If you have an enterprise front door, if you will, in the form of a firewall, that, then, becomes a single point of failure. So you've got to really think about redundancy and that sort of thing. The second is that it's also probably a single point of attack, so if someone happens to compromise or exploit problems with that firewall device, then you've given them the whole set of keys instead of a key, so obviously, those two things really have to be considered. It's probably a better idea to have various security perimeters, and what that means is that you define certain services, certain sets of services within your environment, and then you decide what security around those is necessary. And that leads me to this, and this is that you shouldn't just willy-nilly implement firewall software, firewall devices. You need to get a group of people together and you need to talk about what services you provide to your internal users and different categories of internal users and what services that you want to provide or have visible to users outside of your institution. Until that is done, until people understand what the access policy is, then no one should be running around installing firewall devices. You could put those in the wrong place. You could be protecting the wrong services and get yourself in big trouble that way. Brent, in fact, may have other opinions in this area.
BS: Well, I was going to say that a fruitful way to look at it may be to think of a toolbox of firewalling functions that you apply in the right places as appropriate, rather than only one way, sort of one hammer to hit all nails with, which is the way some people think. Probably the people who least understand firewalls would think of them as being the kind of thing that you could apply to any kind of need.
JB: So are you saying, Brent, that we may then have like a matrix or a relationship between the services that are being provided and levels of security that you might want to have?
BS: Sure, and also different kinds of challenges. You'll deal with different kinds of challenges perhaps in different ways, and on different kinds of platforms and different kinds of applications and different kinds of access and so on.
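To make that "matrix" idea concrete, here is a small, hypothetical sketch in Python that maps services to the widest audience allowed to reach them and answers access questions from that table. The service names, zones, and policy are invented for illustration, not an actual campus policy.

```python
# Who is allowed to reach each service: "internet" (anyone), "campus" (any
# campus address), or "department" (only that department's own subnet).
ACCESS_POLICY = {
    "public_web":      "internet",
    "admissions_app":  "internet",
    "student_records": "campus",
    "payroll":         "department",
}

ZONE_RANK = {"department": 0, "campus": 1, "internet": 2}

def is_allowed(service: str, client_zone: str) -> bool:
    """Allow the request only if the client's zone is no wider than the policy permits."""
    policy = ACCESS_POLICY.get(service, "department")  # default to the most restrictive zone
    return ZONE_RANK[client_zone] <= ZONE_RANK[policy]

print(is_allowed("public_web", "internet"))       # True
print(is_allowed("student_records", "internet"))  # False: campus-only service
print(is_allowed("payroll", "campus"))            # False: department-only service
```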
GM: What do you do in your student residences and other areas, let's say, if you have any public sites, for example? Do you see those as areas where you have special security concerns, students compromising the integrity of each other's work, or needing to watch those as the sources of attacks against other parts of campus? How do you look at your student areas?
MB: With suspicion! Let me say this first, about that. You know, with such a large, diverse student population like we have, you know, we know that of the 95,000 or so students, some of those--well, let me put it this way. They may have an inclination toward inappropriate behavior.
BS: And too much time on their hands.
MB: And too much time on their hands, right. Opportunity is what we call that.
BS: Well, it's also that in the university, you've got an open environment. You don't even know that the person walking up to one of your work stations is one of the members of your community.
MB: That's correct. Now, in our student technology centers, we have taken a couple of different steps to isolate the activities of students from each other and from other offices that may be nearby, for example. We separate the networks from administrative office networks. I think in all cases, Brent, we have the student technology centers on their own network subnet.
BS: Yeah. They're specifically isolated.
MB: Right. Now, we also use switched network technology within those, so that connections at least appear to be end to end, and one student on one device, then, cannot view the activities of another student on another device in that same cluster. Also what we've been doing over the last year or so, maybe a little bit longer than that, is that every device in our student technology centers, which we used to call student clusters--you've got to rename things on occasion to keep people on their toes--almost all or all of those devices now require a login. And what that allows us to do is, if someone does something on those devices, we can go back and we can check login records and find out who was on that device at that particular time. That's become very helpful to us. Obviously, students are using Netscape mail to send, well, even death threats, but most often, harassing, abusive messages to other students, or to faculty members when they don't get the grade they expect on some report or paper. You know, we can use that information now to track that activity and point at somebody. And we work with the Office of Student Ethics, or HR if it's a staff member, or the Dean of Faculties' office, or University Police to address those problems. Now in some areas, though, there are still devices that allow anonymous access. We have some areas in our very large student union that are providing services to the students that don't require that login. There are other places on campus that don't do that, either. Our library, for example. You know, one of Indiana University's missions is community service, and if a citizen of the state of Indiana walks into the library, you know, they can check out something from the Indiana University Library. We're not going to be able to create user names for however many million people there are in the state of Indiana. I don't know how many that is. So we're talking with the technical people at the library about options on restricting the capabilities on the devices that they have. You know, they've got a patron database which could be a source of identifying those people as well, because obviously, even if you are a citizen, you still have to have a library card or identification to take that stuff out of the library. Otherwise, they may never get it back. We've also talked some about the dorm room computers, and the halls of residence computing services area, which is not part of the central computing department, maintains the networks in the dorms. So obviously, that is something that we've been working on, too, because what that does is give the students an unsupervised private place to exercise their inclinations, as we talked about a minute ago. I mean, we've got, again, students mail bombing other students, distributing copyrighted material from FTP servers in the dorm rooms, and you know, the music associations aren't overly enthused about our students doing that. And we've been contacted several times about that and have had to take care of those. Just recently, in fact, we were able to get a search warrant for a dorm room because a student was mail bombing another student. And when we went in there and got that PC, it had a copy of Avalanche on it, which is a common mail bombing program. We've also talked here about private versus public networks, and we've not really done too much about this except talk, but it's an interesting idea. When someone places a device on our network, maybe we should label that either a private device or a public device.
And what that means is, if it's public, it's illuminated or visible to the Internet. If it's a private device, then it's only visible to maybe even that department's network subnet.
BS: Actually, we've done a little bit with that, yeah.
MB: Now, of course, the discussion would come around to which of those is the default. In fact, if the default is private and someone wants to make it public, given the definitions that I used a minute ago, maybe they then have to satisfy certain security criteria as well as administrator certification or physical security, whatever it might be.
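A minimal sketch of that registration idea might look like the following, assuming a simple device record with a visibility flag that defaults to private and a hypothetical set of criteria for going public. The field names and criteria are made up for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class DeviceRecord:
    hostname: str
    owner: str
    visibility: str = "private"                 # default: visible only to the local subnet
    criteria_met: set = field(default_factory=set)

# Hypothetical requirements a device must satisfy before it may be Internet-visible.
REQUIRED_FOR_PUBLIC = {"named_administrator", "patched_os", "physical_security"}

def make_public(device: DeviceRecord) -> bool:
    """Flip a device to Internet-visible only if all required criteria have been certified."""
    if REQUIRED_FOR_PUBLIC.issubset(device.criteria_met):
        device.visibility = "public"
        return True
    return False

dept_server = DeviceRecord("grades-db.example.edu", "Registrar")
print(make_public(dept_server))            # False: no criteria certified yet
dept_server.criteria_met |= REQUIRED_FOR_PUBLIC
print(make_public(dept_server))            # True: now visible to the Internet
```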
GM: Is your intention to link that to your authorization scheme, so that you don't give people different IDs for the different viewable areas? It's just that what they can see is based on what they are authorized to see?
MB: Well, we don't have at Indiana University a central authorization mechanism.
GM: Well, that makes it more interesting, doesn't it.
MB: Yeah, doesn't it, though? It gives people jobs too, because the authorization mechanisms are on the various hosts, so obviously those group files or those file protections or host security packages, whatever they might be, have to be administered and managed by somebody. But gosh, you know, in a perfect world, there's a central authentication and authorization service that applications can interact with, people can interact with, you know, however that needs to be. And if you're a grades clerk in an academic department, then this device, this intelligent device knows that's who you are and has this list of things that you're capable of doing within the campus environment.
BS: So far you don't have a whole lot of agreement between different islands of administration on what sufficient identification is, so you've got people in the computing center using something like a Radius server with its own kinds of authentication, identification, and you've got people in the Registrar's office who'll just take money. And halls of residence, they might want your food services ID card and things like that, so we're still a little bit away from being able to have a universal way of identifying yourself to all of the different stakeholders in the university's service provision.
MB: And even grouping people. A prime example that we use around here is that an advisor at IUPUI is not the same as an advisor at Bloomington. At one of them, if you're a faculty member, you're an advisor. I can't remember which one, usually, but at the other one, you have to actually be an advisor, even though you may be a faculty member. You have to be labeled as an advisor.
GM: Do any of these campus environments exchange information about who's been authenticated?
MB: Campus, you mean, within the Indiana University System?
GM: Yeah.
BS: I think he meant the island that I was talking about.
GM: Yeah.
MB: You know, actually, given our recent reorganization under a VP, we do more of that now than we ever did before. The IUPUI campus is using the same authentication mechanism that Bloomington is.
BS: For computing.
MB: For computing, right. And two other campuses, one attached to IUPUI at Columbus, is using that same mechanism, and one at Richmond, Indiana, attached to Bloomington, is using that same mechanism also. So I don't know, four out of eight isn't bad, I think.
BS: I think also kind of tied into that, for instance, we use some of the PIN authentications in order to get initial network authentication. We use lists from the Human Resources department who keeps track of all the people who really, really do work at the University to generate and re-vet the lists of staff members and similar things for faculty members, and similar things for members of the student community and so on. So there is some interplay and some connection between those different groups, but they don't all use the same authentication method right now. And I'm not sure how they could right now.
MB: But gosh, wouldn't it be nice?
BS: Yeah. It would be great if everybody would just accept a fingerprint or something like that, but --
GM: And do you think that, to make progress on that sort of thing, there's going to have to be either a disaster in the awful sense of some kind of security violation or a disaster in the sense that they reach a point where, for financial reasons, they really need to do something and discover that there's a huge roadblock to getting it done?
MB: Well, but what you just said is two different things. The first is, will it take a disaster to centralize and strengthen system-wide authentication? I don't think so. What I think that might do if something like that happens is maybe strengthen these pockets of authentication, that is --
BS: Or further Balkanize them.
MB: Right. It may make them stronger.
BS: More defensive.
MB: Where they are, but not necessarily force a central mechanism.
JB: In terms of how important it is, from what you're saying, for universities and organizations to move towards that sort of model where, in fact, people can be authenticated and authorized to use certain things, are there any basic things that you think everyone ought to be doing right now, or at least have as part of their near-term plan?
MB: Well, the very first thing that comes to mind we talked about a couple minutes ago. Having users with the capability of doing anonymous things is not a good idea. Obviously, for a couple of reasons. If somebody does something bad within our environment, then obviously, we want to know who that person is. There's an accountability factor. The second is that there's a legal liability factor. If one of our students breaks into a machine elsewhere, someone elsewhere is going to expect that we're going to take reasonable steps to take care of that problem. And if we can't find that student or we can't figure out where that thing happened, then we're not going to be able to help. Now, the converse of that, of course, is that if someone breaks into our computers or network, or if someone sends abusive mail to one of our community members, we would expect that the people at the other end are going to help us figure out who that was there as well.
BS: And take it seriously.
MB: And take it seriously.
JB: Well, that brings us perhaps to another question that we wanted to talk about a little bit, and that is, what are some of the most common kinds of security problems that you are experiencing on your campus today? Or what you're hearing about from others?
MB: Yeah, this is Wednesday, right? Well, you know, this is kind of interesting because I spoke to our University Police Department advisory board, I think, Tuesday last week. And in trying to figure out what it is that I wanted to get across to that group of people, reasonably high academic and administrative people within the university on that board, I knew I had to make it interesting. So in preparing for that, maybe I prepared for this as well, and I put together a list of what I called Incident Headlines. Very juicy little headlines with a little blurb at the bottom explaining what they were. And I can cite some of those that were more interesting, cite some of those that were more serious. One that we dealt with just recently, we had an account on one of our main academic computers that we were sure had been broken into. This person was collecting hacking tools, this person was probing other systems elsewhere within our environment and out on the Internet, so we thought, "Well, you know, we need to watch this a bit." And we collected information. We watched what this person was doing. We watched the processes active under that account, and we came to the conclusion that account was broken into. So our first visit, when we discover something like that, is with our University Police Department, and they're actually becoming much more active. I think we talked about this a little bit last time, last session. So we decided that we would put a watcher on that account, and if someone logged into that account, then we would be paged while we were sitting there talking to the Police Department, because we knew they logged in at a certain time each day. Well, I'm going to shorten this story and say that in fact, what had happened there was an IU graduate student had given access to all of their IU accounts to their two teenage children. And the older of those two teenage children was using that account to learn all about hacking, and in fact, had experimented and had been probing other sites. And we had a very interesting conversation with the parents. That one, in fact, is still open. Those accounts are all still disabled. Another one is that we had a student that was apparently providing a mail-bombing service to other students from their dorm room.
JB: Oh, dear, that's a new one.
MB: And then we got some evidence, we went downtown, we visited with the prosecutor. We went to the county court, we got a search warrant. We went back to the dorm room. I mentioned this before, earlier in the session. We seized that device. Obviously, all of this with IU Police Department. Took the device back to the department, went in, looked at it, found an unbelievable array of different tools, mainly a mail bombing program. As far as I know, that case is still pending with them over there. Another problem that we see quite a bit is copyrighted materials on dorm room computers. This problem is getting worse, and it's getting worse not necessarily because more students are doing that. I think that's the case, but the problem is getting worse because the people that are interested in that not happening--music associations, publishers, record companies--all those places now have dedicated staff looking for that stuff on the Internet. We have gotten over the last three or four months probably eight, I think, through the President's office and through the Vice President's office, complaints from those kinds of agencies, and in one of them, from the RIAA, they even say, "We've got people looking at this and we found this site at your institution. Please take action." So obviously at this point, we're gearing up. We're increasing our awareness program and sending material to the students that might do that in their dorm rooms, and taking care of that problem. But another big problem we have is Spam relay. Our people generate enough Spam from within our own environment that we have to deal with, but one of the big problems on the Internet right now is that mail hosts may be--I can't say mis-configured because some of them are probably configured this way on purpose--but they allow other agencies to send mail, thousands of pieces of mail, through those mail hosts to other destinations on the Internet, and you know, there's some, I call them, Internet vigilante groups out there that are looking at this very seriously. There's some blacklists being maintained that institutions can use to keep that from happening. Well, we've had to change several of our services to the consternation of some of our users, certainly, because we've got people attaching to ISP's and then interacting with other users through IU's resources. But we've had to disallow that mail relay on our servers. So the choice there, of course, to our users was, well, either we're blacklisted by n number of sites and you can't interact with people at those sites, or you have to change the configuration of your PC not to use our relays to send mail to other people outside of IU. So that one's been big over the last month, in fact.
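A bare-bones sketch of that anti-relay rule: the mail host accepts a message only if it comes from a campus client or involves a campus address, so outsiders cannot bounce mail through it to other outsiders. The domain names and address range below are placeholders, not IU's actual configuration.

```python
import ipaddress

LOCAL_DOMAINS = {"indiana.edu", "iu.edu"}                    # illustrative, not exhaustive
LOCAL_NETWORKS = [ipaddress.ip_network("129.79.0.0/16")]     # example campus address range

def is_local_address(addr: str) -> bool:
    """True if the mail address belongs to one of the institution's domains."""
    domain = addr.rsplit("@", 1)[-1].lower()
    return any(domain == d or domain.endswith("." + d) for d in LOCAL_DOMAINS)

def is_local_client(client_ip: str) -> bool:
    """True if the connecting client is on a campus network."""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in net for net in LOCAL_NETWORKS)

def accept_for_relay(client_ip: str, sender: str, recipient: str) -> bool:
    """Refuse external-to-external relay; allow mail that touches the campus."""
    return is_local_client(client_ip) or is_local_address(sender) or is_local_address(recipient)

print(accept_for_relay("129.79.1.10", "student@indiana.edu", "friend@example.com"))  # True
print(accept_for_relay("203.0.113.5", "spammer@example.com", "victim@example.org"))  # False
```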
GM: We should pause for a minute here and see if there is anybody listening in, either via real audio or is on the phone call with us that would like to ask any questions.
JB: Maybe in terms of the questions, maybe we'd like to just respond to the person who had written in, I believe, just a little earlier asking about whether our experts could recommend a good book for the firewall administrator that might deal with some of these issues.
MB: Well, let me say two things about that. One, we do not have a firewalling strategy or a firewalling policy at Indiana University. So I guess, call that a disclaimer, I don't know. The text that's thrown around most often related to firewalls in general is the Firewalls and Internet Security text by Cheswick and Bellovin. I was going to jot down the ISBN number, but I didn't see that. But do a search on firewalls on the Internet using your favorite search engine and that text will more than likely come up a couple of times.
BS: Also some nice firewall lists.
MB: Correct. And in preparation, I was supposed to write those down, and I didn't do that either.
JB: Well, maybe we could post that on the Website afterwards.
MB: Let's do that.
JB: Would you mind perhaps repeating that title just one more time?
MB: Firewalls and Internet Security is the name of the text.
JB: And the author on that one was --
MB: Cheswick and Bellovin. Now, NIST also has a publication, and I think it's 800-7. That's actually an excellent tutorial--a beginning tutorial, to be sure, but an excellent tutorial on firewalls and what they do, different kinds, that sort of thing, too, so listeners might want to take a look for that also on the Internet.
JB: And they could find that, you think, by just doing a search on that?
MB: Absolutely. That's what I do. If something comes up, the first thing I do any more is go to Alta Vista, which is my search engine of choice, and take a look. And you can get an unbelievable amount of information. Now, we don't have any information about the specific product that the person mentioned in the e-mail. I think we may have an evaluation copy of that, but we haven't done anything with it. But again, what I would do is I would take the name of that and I would search on that on the Internet and you will get bunches of information.
JB: They might find a review by someone else, in fact.
MB: Comparisons is what you'll find.
BS: And discussions of features and so on.
JB: All right, very good. Joel, are you still with us?
J: Yes.
JB: Do you have a question?
J: I don't have a question. I have Mike (inaudible) here with me. I don't know if he's got a question.
M: Actually, yes. We've had some of the problems with break-ins from external users to our campus network. Obviously the system of choice usually has been to try and break into UNIX systems. One thing that has me paranoid about the future, and I don't know if anybody else has seen this type of attack, but all of our hubs are intelligent hubs. They all have IP addresses, they're all manageable by SNMP. And I'm kind of paranoid that someday that's going to be what people attempt to do is break into your infrastructure and bring down switches, bring down hubs, etc.
BS: Or even just turn off ports and things like that. Yes, that's something we thought of. And in fact, we do have something. You know, Mark talked about how we don't have a firewall per se, but we have a lot of firewalling functions that we've talked about. One of the things that we do -- couple things we do. One of them is that we block SNMP access into our campus altogether.
M: Yeah, that's what I'm doing using our Cisco router, using standard tables on it.
BS: Sure, you can do that.
M: Well, today, in fact.
BS: Another thing that you can do, or at least that we do is that equipment that has some sensitivity with regard to network management in particular and some of the networking equipment that's vulnerable in those kinds of ways, that you don't even want people poking at, it's possible to use private addresses that are routed within your campus but not routed anywhere else. At least that keeps the traffic from them being able to be returned. It doesn't keep people from being able to get to them if they can do source routing, but then you can turn that off, too. So there are a lot of things that you can do to at least limit access to those, and you can do even nastier things, if you want to.
M: Okay, thanks.
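A rough model of the two protections Brent mentions, blocking SNMP at the campus border and keeping management gear on private, unrouted addresses, might look like the sketch below. The address ranges are placeholders; only the SNMP port numbers are standard.

```python
import ipaddress

CAMPUS_NET = ipaddress.ip_network("129.79.0.0/16")   # example campus range
MGMT_NET = ipaddress.ip_network("10.10.0.0/16")      # private, management-only addresses
SNMP_PORTS = {161, 162}                              # standard SNMP and SNMP-trap ports

def border_allows(src_ip: str, dst_ip: str, dst_port: int) -> bool:
    """Decide whether an inbound packet is allowed past the campus border."""
    src = ipaddress.ip_address(src_ip)
    dst = ipaddress.ip_address(dst_ip)
    if src in CAMPUS_NET:
        return True                  # this filter only screens traffic arriving from outside
    if dst_port in SNMP_PORTS:
        return False                 # no SNMP management from off campus
    if dst in MGMT_NET:
        return False                 # management addresses are invisible to the outside
    return True

print(border_allows("198.51.100.7", "129.79.20.5", 161))   # False: SNMP blocked at the border
print(border_allows("198.51.100.7", "10.10.3.2", 80))      # False: management net unreachable
print(border_allows("198.51.100.7", "129.79.20.5", 80))    # True: ordinary web traffic
```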
JB: Good. Mark and Brent, maybe this is also a good time to raise the issue of just how do we go about providing access to content resources over the network, given all the challenges that we have. You at Indiana have this fantastic variations project that I've been reading about, and certainly that's a resource that we will want to have some way to make that accessible. Or you will want to, I suspect, make that accessible certainly around your campus, if not more broadly.
MB: Well, you know, most of the time when you license an information resource or contract with a provider, what they're interested in having assurance of is that only members of your community can get at that resource. And you know, generally, that's relatively easy to do for a campus. We talked about this a little bit before, you know. We use the same mechanism at probably four campuses. But as long as the people that are accessing that resource, whatever that might be, are required to identify themselves as eligible members of the community, then usually that suffices. Then what happens is, if a couple of universities want to get together and talk about that sort of thing for cost savings or because one university might have the expertise, the capability of supporting that while the other one doesn't, whatever it is, that might be relatively easy as well between a couple of different universities because you can get together and eyeball, and even if the two authentication schemes are different, maybe you can negotiate some common element, some common way of doing that. But one of the things that we've been talking about within the CIC--and if I get this right, that's the Committee on Institutional Cooperation, which is actually the Big Ten universities plus University of Chicago and a couple of laboratories. What we've been talking about in some of the security meetings that we have involving members of that group is how, if the CIO's of those institutions get together and decide that they want to pool their money or what have you, pool expertise and contract with an information delivery source for one database or one copy of a database and then all the schools access that database, even if the contract stipulates that only members of those institutions--you know, the 13 universities, colleges that are involved in the CIC--that in and of itself is a problem because what does that mean? A member of the Northwestern University community might be completely different than what we would think a member of the university community would be. They might include all alumni--and a disclaimer. I don't know how Northwestern addresses this. I'm just using that as an example. They might include all alumni, where we wouldn't do that.
BS: They might include people at their research park. They certainly would include people visiting in Finland. Or something like that.
JB: It sounds as if we're going to have to do a much more in-depth analysis of just community and relative benefits of each of those communities.
MB: The credentials of one university have to be in some way acceptable by another, and our method of authentication might be completely unacceptable to the University of Michigan or Penn State, or what have you. We might be very lax in that area, where they're very secure, or the other way around, which I guess I would like to think. In any case, there has to be some agreement as to what that really means. In addition, if it's restricted material that the institutions are sharing, if you move away from library holdings, for example, though there are those there also. If they're going to share student information, can that be done? There are laws restricting access to student information. The law stipulates that there has to be a reason, obviously, and the student has to give permission. Well, how is all that managed? And then the trust in authentication methods at other universities becomes even more important because if it's Indiana University student information that we're passing to another school, Indiana University is probably still liable if that student decides that that was an inappropriate sharing or if that material is disclosed inappropriately in some way. So we have to be assured that Michigan or Illinois or Ohio State or whoever it is treats that information as we would want to be treated, or as we treat it here. One of the ideas that people have been throwing around within the CIC context is the idea of a central, what we've been calling a multi-authentication gateway. Meaning that there's some virtually central mechanism that takes credentials supplied by a university and then analyzes those credentials and passes those on to other participants in that particular product or that particular sharing project. That's a little tough to do, too, because there's also religious wars as it relates to different authentication schemes. DCE comes to mind. There are several universities that every sentence they use has DCE in it somewhere. There are others of us, and Indiana University is one of those, that isn't completely enamored of that particular structure, that particular security service. So, you know, you have arguments about that. Well, if you have the central entrusted API, if you will, or gateway, then that technology doesn't really matter. If we authenticate our users in some way and we pass that credential to that gateway, then the gateway can then convert that into whatever it is that another institution might need, and the authentication sharing is done that way.
JB: So that means that not everyone would have to have the same kinds of authorization schemes, then?
MB: Bingo! That's exactly right. And then, you know, you're not arguing about what technology to use. The central hub converts that stuff.
BS: Like a directory server, in a way.
JB: Okay. Great idea.
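A toy sketch of that gateway idea follows: each school registers a verifier for its own credential format, and the gateway turns a successful local verification into a neutral assertion the other participants can consume, so nobody has to adopt anyone else's technology. All of the school names, credential shapes, and attributes are invented for illustration.

```python
from typing import Callable, Optional

class AuthGateway:
    def __init__(self) -> None:
        # Map each school to a function that checks its own style of credential
        # and returns the user's community attributes (or None on failure).
        self._verifiers: dict[str, Callable[[dict], Optional[dict]]] = {}

    def register(self, school: str, verifier: Callable[[dict], Optional[dict]]) -> None:
        self._verifiers[school] = verifier

    def assert_identity(self, school: str, credential: dict) -> Optional[dict]:
        """Verify a credential with the home school and issue a neutral assertion."""
        verifier = self._verifiers.get(school)
        attributes = verifier(credential) if verifier else None
        if attributes is None:
            return None
        return {"home_school": school, **attributes}

gateway = AuthGateway()
gateway.register("indiana", lambda c: {"member": True} if c.get("pin") == "8321" else None)
gateway.register("michigan", lambda c: {"member": True} if "kerberos_ticket" in c else None)

print(gateway.assert_identity("indiana", {"student_id": "0001234567", "pin": "8321"}))
print(gateway.assert_identity("michigan", {"kerberos_ticket": "opaque-blob"}))
```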
GM: We're getting close to the close here, and we do have a question that's come in via e-mail from Alan Winnery at the University of Hawaii. It's got two parts, two quite different areas. One is, is it typical for there to be staffing at universities specifically for security event response? And the second one is, what are the legal issues, such as privacy, that exist around direct monitoring? For example, sniffing via tcpdump by a network administrator.
JB: Good questions.
MB: Excellent questions, in fact. Let me address the first one first, which would be good, I guess! Yes, it is now very common for universities, at least the ones that I interact with, the representatives that I talk to at conferences and what have you, to have staff dedicated to security, and most of them, then, have staff dedicated to incident response. I was putting together a contact list for the CIC schools, as we talked about a minute ago, and fully half of those automatically included their incident response mailing list, whether that be Assert or Incident or whatever you want to call it. And on the other end of that, there are staff, then, that take those incidents, analyze the situation, collect information--and I'm segueing into the second question at this point. Collect information, and hopefully what they do, and what we do, is then put that packet together for whatever university governance body needs to be involved. Percentages would tell you that the majority of that goes to the Dean of Students' office. So I guess that answers the first question. The second part of that was privacy related to collecting information, I assume related to that. And that's a touchy area. What we do here is, we do no content review. Absolutely no content review unless there's a complaint. In other words, we do no proactive systematic review of Web pages, we don't look through e-mails, we don't look through computer files of any kind, nothing that's attributable to a user do we look at unless we're directed to by university administration, because we assume that they know what they're doing. You know, the president's office or university counsel, something like that. Or there's a court order handed to us, naturally. But what we do do is, we will use system log information, we will use e-mail log information. I mentioned that we use login records on our student technology center PC's from the server that governs those. We use that information freely because, obviously for the system stuff, we use that for performance monitoring, performance tuning.
BS: In a statistical way.
MB: In a statistical way, in aggregate. But also, if we get complaints that some particular person sent an e-mail, we may in fact go to the e-mail logs and see if that e-mail with that ID, that message ID, came out of that account. We don't look at the contents. We don't care what the content is. Somebody does, but we don't. And then we can verify that it came out of that account. Now that particular instance is fairly rare. But what we also use the system logs and that sort of thing for is to see who has signed onto a particular device or a particular system at a particular time.
BS: We save lots of things retroactively that we can go back and look in logs and so on, but we don't usually scan them automatically.
MB: Correct. Now, as far as proactive intrusion detection goes, there are obviously ways to do that and there are very specialized products now. I think the one that we were looking at a short while ago is called Internet Flight Recorder, IFR. Where it will actually do a lot of stuff for you related to that. It will collect a lot of different kinds of information like Web page accesses in and out. It will track e-mail packets or e-mail headers as they go by, and does things like that. Well, the first time that that particular product was mentioned to me, I said to the guy, "Well, let's not, and just put that away, because that's not where we want to go."
BS: But there are also some network monitoring tools that look for attack patterns and things like that.
MB: Exactly, and from an intrusion detection standpoint, that might be where we want to go. Checks for patterning, things like --
BS: Denial of service attack
MB: Right. I was thinking of SATAN, for example, of someone scanning your system using that tool. That leaves a relatively easy to find footprint. ISS does the same kind of thing. In any case, you can see when things happen and that kind of thing, certainly, I think that we would be happy doing. But gosh, you know, looking at e-mails and looking at Web pages proactively is a real problem for us. We will respond to complaints, but that's about it.
JB: Well, just to kind of close that question off, then, if I can interpret what you're saying, you don't really do anything proactively. You wait until there's a problem and then you go back and do tracking afterwards.
MB: Yeah, generally that's what we do. Unless it's statistical collection and performance tuning and that sort of thing.
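As an illustration of that reactive, complaint-driven log check, here is a small sketch that looks up a complained-about message ID in a retained mail log and reports the originating account without touching message content. The log format is simplified and invented, not any particular mailer's real log format.

```python
import re
from typing import Optional

SAMPLE_MAIL_LOG = """\
1998-04-15 09:12:03 id=<19980415.091203.ab12@mail.example.edu> from=jsmith action=sent
1998-04-15 09:14:47 id=<19980415.091447.cd34@mail.example.edu> from=mjones action=sent
"""

def account_for_message(log_text: str, message_id: str) -> Optional[str]:
    """Return the local account that originated the given message ID, if it was logged."""
    pattern = re.compile(r"id=<(?P<id>[^>]+)> from=(?P<account>\S+)")
    for line in log_text.splitlines():
        match = pattern.search(line)
        if match and match.group("id") == message_id:
            return match.group("account")
    return None

# Example: a complaint arrives citing this message ID.
print(account_for_message(SAMPLE_MAIL_LOG, "19980415.091447.cd34@mail.example.edu"))  # mjones
```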
JB: Okay.
MB: Now, you know, we do filter on things. We do, like I said earlier, we put in rules in our mail relays that disallow Spamming or external to external mailings and that sort of thing, but I'm not sure if I would put that in the same category.
JB: Okay. I think that Alan will probably find that information all very helpful. Let me just scan very quickly. We're really actually past time here, but does anyone have a final question that they'd like to put to our experts? Greg, have you got anything at your end?
GM: I'm amazed at how much we've covered and how much remains to be covered. We could get an entire series out of this.
MB: Gosh, I've got 15 pages of notes here that we didn't even get to.
JB: Right, and we didn't even talk about IPv6, either.
MB: That's right. And Brent was right here for that purpose.
GM: Well, we should have people send us e-mail. If they'd like to hear more and you're still listening, send us e-mail and tell us that we should continue this session.
JB: Is there someone on line?
KC: Actually, yes, we are, and if you can continue, that would be great.
JB: Would you like to identify yourself?
KC: Kerry Castle, Morris Ramsey from University of Arkansas for Medical Sciences.
JB: All right. Very good. Well, given that, shall we go ahead and talk about IPv6 and what we have there? We were going to ask our experts about just what is the status of the new protocol that's coming along, IPv6, and what should we be watching for with that?
BS: Well, one of the reasons that came up is that we were talking about a piece of IPv6 that's been recently broken out, called IPsec, that's been released by the IETF. It's made some recommendations about how to do authentication. They allow a couple different means. It has some ways that look like they're being picked up by the different network vendors, so they'll be providing hooks into applications as well as equipment. It looks like it's a good way to do it. Right now, the ways to do that are largely proprietary.
JB: Mark, maybe you want to give a little bit of background on IPv6? That's a new protocol. It is released now, right?
BS: Part, yeah. I mean, IPv6 is the standard that's been out for some time, but mostly people are not doing very much with it. One of the things that we're seeing is that some of the advantages of it are being picked up separately and being integrated into existing products, and that's what's happening with this IP Security layer.
JB: Okay. And so vendors are taking out pieces and incorporating it into their products?
BS: Well, not only vendors, but other standards groups like the IETF. As I say, IPsec is an IETF standard that has provisions, hooks that people can use independent of the rest of IPv6, which has a little more baggage than some people are willing to endure at the moment.
GM: Can you talk a little bit about what capabilities it provides for? Maybe some of the ones that the vendors are pulling out and using independently?
BS: Well, the nice thing about this particular thing is that it's looking largely at authentication, and doing that in a standards-based way. And I guess that's the main thing that I'm looking at right now. There are a couple different authorization schemes that they use, one called SKIP and another one called ISAKMP. ISAKMP is the one that the IPv6 standard specifies, but the IETF is allowing either one of them at the moment. And they both have to do with certificate kind of authority and then key verification and so on. So, one of the things that you always worry about, and we talked about this at the very beginning, is how do you know who to trust and how do you know which other institutional methods of trust you trust? It's kind of the chicken-and-egg problem that you're always having with security, which other mechanisms like Kerberos, for instance, have to deal with as well.
GM: We have some additional e-mail coming in. First the note that George Markham says hi to Kerry, and secondly the--
MB: What is this, Prairie Home Companion?
GM: And Steve Thompson would like to hear some more on two topics, IP spoof prevention and denial of service.
MB: Gosh.
JB: Have we stumped an expert with our spoof question?
BS: No, those are really big areas. This question immediately caused denial of service! Obviously, the question with spoofing is someone's pretending to be someone that they're not, usually so that they can do something they'd be trusted to do if they were someone else. But being who they are, they're not trusted. So they try and pretend that they're the institution or the host or something like that. Lots of these things are host based. And there are some ways of doing that. It used to be, in the olden days, that people would just think that routing would take care of that, and there are some very clever ways of getting around routing to look like you're someone that you're not. There are many places that don't verify what network addresses the packets leaving them claim to come from, when those packets are really made to look like they're from someone else. One of the important things that people can do, a kind of good citizenship thing, is to make sure that you block what look like spoofed packets coming from your own institution. This is an important thing that everyone needs to do, and as you do that, you make the world sort of better, safer for everyone else.
MB: And related to that, if you do have a--for lack of a better term, an enterprise firewall, one of the things that you really should do is block traffic from outside that looks like it comes from you, and from inside that looks like it comes from someplace else.
BS: So that's the kind of classic anti-spoofing sort of firewall approach. You block the stuff so that the spoof -- looks like it's coming from the outside world, but it says it's from inside, and so you say, "No, no, no, I know that's wrong." And so you throw it away. And in turn, you also block your own people from spoofing by blocking things coming from your own institution that say they're from somewhere, from Yale, for instance.
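Those two classic anti-spoofing rules can be sketched very simply; the campus address range below is just an example.

```python
import ipaddress

CAMPUS_NET = ipaddress.ip_network("129.79.0.0/16")   # example campus range

def pass_inbound(src_ip: str) -> bool:
    """Inbound traffic must not claim to originate inside the campus."""
    return ipaddress.ip_address(src_ip) not in CAMPUS_NET

def pass_outbound(src_ip: str) -> bool:
    """Outbound traffic must carry a genuine campus source address."""
    return ipaddress.ip_address(src_ip) in CAMPUS_NET

print(pass_inbound("129.79.5.5"))     # False: outside packet spoofing an inside address
print(pass_inbound("198.51.100.9"))   # True
print(pass_outbound("198.51.100.9"))  # False: our own user spoofing someone else's address
print(pass_outbound("129.79.5.5"))    # True
```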
MB: Now, denial of service, you could list a hundred different ways that's done. The most recent high visibility situation was the case where somebody, and apparently at this point no one still knows who that somebody is, attacked personal computers all over the place. Government agencies, a couple of Big Ten universities were hit kind of hard.
BS: Government labs.
MB: Government labs, and those affected NT work stations and Windows 95 work stations. And the way that those things worked, there wasn't anything that remained on the machine to use to determine the origin of the attack because generally what they did was they caused the machine to hang or you got a blue screen error and you had to reboot, so anything that was stored there, you lost. That was the most recent case. But, gosh, I don't know what the question is completely, but even -- well, I had a case this morning where a student was running a process on one of our central servers. Let me just say what they were doing. They were trying to crack RC 5 because they were invited to do so by the people that own that. You know, there's a $10,000 reward for somebody that does that. Well, they were using 99.9% of that processor. That is a denial of service attack if you think about it purely, because no one else on that machine was able to do anything while those cycles were being consumed by the other process. So you can throw that into the category of denial of service.
BS: But there are kind of different strategies for every kind of denial of service, because there are so many different kinds. Generally, what's being done nowadays is to issue warnings and alerts and work-arounds for each specific kind of denial of service. Many of them are things that may work for other kinds, too. There are some firewall and other kinds of intermediate devices that will identify certain patterns of denial of service that we've talked about, and take some kind of proactive action automatically for them. Some of the firewall products will do that, and even some of the router vendors have introduced some of those things now, that are just part of the routing code to identify and interrupt certain kinds of attack patterns.
MB: Our security Website, also, for example, kind of redistributes patches for --
BS: Sure, that's an important thing.
MB: The out-of-band attack (inaudible) which was what we were saying the other day, the bonk, the teardrop to -- I'm trying to think down that list. But one of the important things --
BS: Land.
MB: Land. Thank you. But one of the important things that we can do, that is, security units, security people, technical people, whoever is capable of doing this or in a position to do this is to provide that kind of resource for the users and their community.
BS: And to make it apparent that it's important. A lot of people, especially in the academic community, don't get around to it. Patching is weak.
MB: Right, and what they do, they're busy, they're trying to get payroll in and they say, "Well, it didn't happen to me yet. Why would it happen to me?" and then they press on. So what we're trying to do also is to increase the awareness. Fix it now, and then obviously you don't have to worry about it. You don't want it to happen when you're trying to print W-2's, you know. A lot of people are going to be angry about that.
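Going back to the pattern-identifying safeguards Brent mentioned, one simple rate-based check of that sort might look like the following sketch; the window and threshold are arbitrary illustrative values, not a recommendation.

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_ATTEMPTS_PER_WINDOW = 100

# Recent attempt timestamps per source address.
_recent: dict[str, deque] = defaultdict(deque)

def record_attempt(src_ip: str, timestamp: float) -> bool:
    """Record a connection attempt; return True if this source now looks abusive."""
    attempts = _recent[src_ip]
    attempts.append(timestamp)
    # Drop anything outside the sliding window.
    while attempts and timestamp - attempts[0] > WINDOW_SECONDS:
        attempts.popleft()
    return len(attempts) > MAX_ATTEMPTS_PER_WINDOW

# Example: a single source hammering a service trips the flag.
flagged = any(record_attempt("203.0.113.8", t * 0.05) for t in range(150))
print(flagged)  # True
```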
GM: I think we have one more question and I think probably with that question, we ought to end. It's from Carl Castle and he says -- I'll give you the whole question.
JB: I think it's a she.
GM: Oh, it's Kerry! I'm sorry, I wasn't looking carefully. Do you allow modems on individual network PC's? Downloading from bulletin boards? How do you prevent your institution from becoming a de facto ISP?
JB: Good question.
MB: Well, the answer to the question is no, because you have two categories of kind of personalized gateways into your network environment. One of those is an active modem on a desktop. The other, though, is a multi-user work station on a desktop. Either one of those can be used as a gateway into your environment, depending, again, on how the administrator of that particular device has set that device up. If you have a dial-in, you have a modem, an active modem, a listening modem on a device, it does not matter how many firewalls you use. Unless you do it on your voice circuit, nothing is going to help you there. That's a problem. But if you think about the ramifications of issuing a policy denying that capability --
BS: At a university.
MB: In a university environment, one is going to have to weigh the exposure, the risk and the--I don't know how to put this.
KC: Freedom.
MB: The response. The freedom of people, thank you, of using whatever they want on their desktop. That extends a little bit further because a faculty member might buy an HP work station and put it on his desktop and then get an undergraduate in the computer science school to maintain that. Well, the undergraduate in the computer science school is an undergraduate in the computer science school, and they may or may not know what they're doing when they're maintaining that device. So even suggesting to some of these people that they need to have certified system administrators taking care of those devices, they don't know where to get those. They don't have the wherewithal, the funding, what have you, to administer those things correctly. But they still want to be able to do that. So the initial answer is no, we don't restrict those, and the long answer is, someone probably needs to think long and hard about those. And when they do, have them contact me and tell me how that went.
KC: That's what we're doing right now!
JB: Okay, very good. Thanks, Kerry, and everyone else for all of their questions. I think it is perhaps time to close out our session for today, and I really thank everyone for all of the encouragement for continuing. This is our last session that we had planned for this spring, but we had been thinking about going ahead and offering a few more, so we'll take that into consideration and get back to everyone.
GM: That's right. Your notes and letters can prompt us to do more.
JB: That's right. Should we ask them to send money and we send cups, too?
MB: It depends on what you do with the money.
JB: Pay the encoder. Brian Vaughn gets all the money.
BS: He deserves it.
JB: But at any rate, do watch the CREN Website for further announcements, and also you can send us a note at CREN, at cren.net if you would like to receive personalized announcements or a subscription to the announcement list for these events. I would like to thank all our experts, Mark Bruhn and our other guest, Brent Sweeney. Also, thank you Greg Marks, and also the team at Merit. Brian Vaughn, the audio encoder from UM online and all of those who joined in, and also the board of CREN, the Corporation for Research and Educational Networking. You were here because it's time.