© Distribution of this video is restricted by its owner
00:08 | Welcome to my talk. I'm Stephen. My area of research is intrusion detection, and I'll be sharing some of my work with you. Let's start with a discussion of cyber defense.
00:34 | Let's look at the very important elements of cyber defense. These bars are the layers of defense, and I'm going to show you one at a time. The first defense is to deter intruders, malicious actors, from coming into our system. As you can see, if we can deter them, then they abandon the attempt and will not even try to come into our system. This includes methods like using very secure passwords, encrypting documents, and so on. If the cost of breaking in is high for the intruder, a lot of them will give up.

01:24 | But unavoidably some of them will get through, and that's where the second layer of the defenses comes in, which is what we call protect. If we can protect our system, then some of them will not be able to come in. This is things like firewalls and protection software.

01:51 | But again, some of them will continue to break into the system. So what are we going to do next? The next layer is that as they try to come into our system, we will try to detect them; if we detect them, we kick in and do not allow them to continue to stay in our system.

02:20 | That is good, but still a very small fraction of them is going to come into our system, and that's where the respond layer comes in. We respond by maybe undoing some of the damage done by the intruder, and so on. If we can properly respond to the malicious actor, then that's probably all we can do. But unavoidably some of them will come in anyway, even though we have these layers of deterrence, protection, detection, and response.
03:02 | That kind of brings up the zero trust architecture that is quite popular now. We don't have time to get into it, but you can see that our efforts will mainly be in detection, because that's what's important. The next two sections will be talking about network intrusion detection and then some of the additional things that we have been doing.
03:38 | If you look at the computer that we're trying to protect on one side and the adversary's client on the other side, how do they get into our system? Well, there's this nice thing called secure shell that allows people to remotely log in to a server. This is great for system administrators, for example: you don't have to be physically at the machine, and you are able to manage a lot of machines remotely.
04:05 | But the adversary will not be doing this directly, because if there's a direct connection, then there will be an exchange of IP addresses between the two machines, and it's easy for us to detect their identity. So typically they will hide their identities behind some intermediate network. This is not limited to one kind of network; we'll show you some examples later. The purpose of these networks is to protect the identity of the client, for reasons of privacy and others. But hackers, adversaries, can use them to hide their identity and get into our system, so we will not be able to identify them so easily. So let's look at several examples.
05:11 | The first example is the stepping-stone network. As you can see, this is how we can build our own network. In this network we have two hosts, H1 and H2. The client will connect to H1 first, then connect to H2, using secure shell, and finally connect to the target server. You can see that at the server, which is what we're trying to protect, we can only see H2, not H1 and not the client. Typically they want to have three hops in this route in order to hide their identity, so that no single host in here will have both the client's IP address and the server's IP address. If you only have one host in between, then that host will hold both IP addresses, and that's a little bit dangerous. So that's one way of setting up a network to hide your identity, but you don't really have to build it yourself.
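To make the stepping-stone point concrete, here is a small sketch (mine, not the speaker's; the host names and addresses are invented) showing that with a client, two relays, and a server, no single relay observes both endpoint IP addresses:

```python
# Illustrative simulation of a client -> H1 -> H2 -> server chain,
# recording which peer IP addresses each machine observes.

def relay_chain(path):
    """Given a path of (name, ip) pairs, return the set of peer IPs each node sees."""
    seen = {name: set() for name, _ in path}
    for (a_name, a_ip), (b_name, b_ip) in zip(path, path[1:]):
        seen[a_name].add(b_ip)   # each node sees its next hop...
        seen[b_name].add(a_ip)   # ...and its previous hop, nothing else
    return seen

chain = [("client", "10.0.0.1"), ("H1", "10.0.0.2"),
         ("H2", "10.0.0.3"), ("server", "10.0.0.4")]
seen = relay_chain(chain)

# The server only ever sees H2's address, never the client's.
print(seen["server"])                                     # {'10.0.0.3'}
# No single intermediate host holds both the client's and the server's IP.
endpoints = {"10.0.0.1", "10.0.0.4"}
print(any(endpoints <= seen[h] for h in ("H1", "H2")))    # False
```

With only one intermediate host, the same simulation would show that host holding both endpoint addresses, which is the "dangerous" single-hop case the talk mentions.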
06:33 | Setting up that kind of network requires you to own some of the machines, or at least have access to them. But there are actually a lot of nice services available out there; again, privacy is a very good reason to offer them, but they can be abused. So the next one we're showing is the proxy server, and a lot of them are available for free. You pretty much do the similar thing: you connect to the first proxy, you connect to the second proxy, and then you connect to the target server. Again, three hops allow you to protect your identity pretty well.
07:22 | The next example you have probably heard of: Tor. That's a network that already exists; a lot of machines that people are donating are in it. Again, you will connect through three machines. The first one is typically called the entry node, then there is an intermediate node, and then there will be an exit node. Then you go to the target. So this is again kind of a three-hop connection, and the server only knows the packets are coming from the exit node, but the real client is actually sitting somewhere else.
08:12 | The fourth one, which many of you are probably using, is the VPN, the virtual private network. The university does provide a VPN for you to connect to the university, so that you appear to be on campus instead of at home. In this case it is kind of two hops instead of three. So let me summarize.
08:42 | We've seen four examples, and in the model that we propose they all look like this: the hiding network has only one node that is actually exposed to the server they're trying to get into. That's the only thing the server sees, and everything else is hidden from it.
09:09 | So what should we do if this is the case? Well, the real goal is that we want to be able to distinguish between the two. Do we have an adversary coming in? Well, I shouldn't say adversary: a user that connects to our target directly is nice and safe, and at least we don't worry about it so much, because we can identify them. Or is the user hidden behind one of these networks? Our job is to be able to tell them apart, and a lot of our work involves using machine learning algorithms to identify them. So the attack scenario is an adversary hiding behind these networks trying to conceal their identity, and our job is to be able to identify an intruder versus a normal user.
10:03 | The typical approach is that we analyze the data. There are two kinds of packets that we can use. One is the data packets: the normal secure shell traffic where the user is either viewing a file or editing it. The second is the protocol packets, because in order to set up these connections you need the three-way handshake, and you need the secure shell key exchanges and so on to set up the encryption.
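As one illustration of the kind of traffic analysis this involves (my own sketch; the feature choice and the numbers are invented, not the speaker's actual method), simple timing statistics can be computed from packet timestamps and fed to a classifier:

```python
# Hypothetical sketch: derive simple timing features from packet timestamps.
# Real systems would use many more features (sizes, directions, protocol fields).
from statistics import mean, pstdev

def timing_features(timestamps):
    """Mean and standard deviation of inter-arrival times, in seconds."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return {"mean_gap": mean(gaps), "std_gap": pstdev(gaps),
            "n_packets": len(timestamps)}

# An interactive session (bursty keystrokes) vs. a steady bulk transfer.
interactive = [0.0, 0.9, 1.1, 4.0, 4.2, 9.5]
bulk        = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]
print(timing_features(interactive)["std_gap"]
      > timing_features(bulk)["std_gap"])   # True
```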
10:39 | To give you a sense of how difficult this is, I'm using a stepping stone as an example. You can see that at the server, you can only see a bunch of hosts connected to you directly. But who knows, there may be an adversary hidden somewhere behind them, connecting through them. Those hosts outside the box, you cannot see them. So how are you going to detect the intruder by only analyzing the connections that come to us directly? That's pretty hard. Alright, so I'm not going to give you solutions; we don't have them yet for that.
11:31 | The next section is about host intrusion detection. That's not at the network boundary but inside the machine: if we have somebody who has intruded into our host, can we detect them there? In my earlier classification this is also, more or less, classified as a response to the attack. The scenario in host intrusion detection is that an intruder is already inside the host, having come in through secure shell, and we want to identify them before they can do any damage.

12:19 | So the approach that we take is modeling the user behavior, because it's our hypothesis that the intruder and a normal user typically do things slightly differently. You can imagine an intruder probably wants to scan a lot of files in your system and try to extract information from it; that is how attackers typically will go through the file system. On the other hand, a normal user will probably be using a couple of files and jumping back and forth. You know, it's more like you're editing a program, then running it, then coming back and debugging it, and so on. So we're trying to separate them by their behavior. In order to do that, we need to capture the behavior, and a graph is a great model for us.
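A minimal sketch of the graph idea, assuming file-access events arrive as a simple sequence (the event data and the revisit statistic are my own invention for illustration, not the speaker's actual model):

```python
# Illustrative sketch: model a user's file accesses as a directed transition
# graph and compare a simple breadth statistic. Event sequences are invented.
from collections import defaultdict

def access_graph(events):
    """Build file -> set(successor files) edges from an access sequence."""
    g = defaultdict(set)
    for a, b in zip(events, events[1:]):
        g[a].add(b)
    return g

def revisit_ratio(events):
    """Fraction of accesses touching an already-seen file
    (high for normal edit/run/debug loops, low for scanning)."""
    seen, revisits = set(), 0
    for f in events:
        revisits += f in seen
        seen.add(f)
    return revisits / len(events)

normal  = ["main.c", "a.out", "main.c", "a.out", "main.c", "notes.txt", "main.c"]
scanner = ["/etc/passwd", "/etc/shadow", "/home/u/.ssh/id_rsa",
           "/var/log/auth.log", "/etc/hosts"]
print(revisit_ratio(normal) > revisit_ratio(scanner))   # True
```

A real system would compute richer graph features (node degrees, walk patterns) and feed them to a learned classifier, but the jump-back-and-forth versus scan-once contrast is already visible in this one number.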
13:36 | So that's another thing we are doing. The third thing that we are doing is malware classification and detection. This is also sort of in the host intrusion detection area, because the scenario is that malware gets into our host somehow, and our objective is to classify the malware. Well, our ultimate objective is to detect it; we are not there yet, but first we want to classify malware into groups of similar variants. It's much like the virus going around right now: there are so many malware samples, and a lot of them are variants of each other. The common detection method is to capture a signature of the malware, but the trouble is that attackers can make the malware slightly different, and then the signature will not work because there is some change to the code. So it turns out that there are a lot of malware functions that do the same thing but look very different to the outside world. So what we want to do first is to group all these malware variants together into families. How do we classify them into families, and how well can we do it? That's what we've been doing. Here is a simple experiment that we did: we have like 12 or 13 families, and we classify them using an n-gram model, for which we have to extract features, which isn't really a very easy task.
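Here is a minimal sketch of what n-gram feature extraction can look like (the byte strings, the bigram choice, and the similarity score are invented for illustration; the talk does not specify the exact features used):

```python
# Minimal sketch: extract byte bigram (2-gram) counts as features for
# grouping malware variants. Real pipelines add feature selection and
# a trained classifier on top of features like these.
from collections import Counter

def byte_ngrams(data: bytes, n: int = 2) -> Counter:
    """Count overlapping n-byte substrings of a binary blob."""
    return Counter(data[i:i + n] for i in range(len(data) - n + 1))

def similarity(a: Counter, b: Counter) -> float:
    """Simple overlap score: 2 * shared n-gram mass / total mass (0..1)."""
    shared = sum((a & b).values())
    return 2 * shared / (sum(a.values()) + sum(b.values()))

# Two "variants" sharing most code, and one unrelated blob (all invented).
v1 = byte_ngrams(b"\x55\x48\x89\xe5\x90\x90\xc3" * 4)
v2 = byte_ngrams(b"\x55\x48\x89\xe5\x90\xc3" * 4)       # slightly modified variant
other = byte_ngrams(b"\x00\x01\x02\x03\x04\x05\x06" * 4)
print(similarity(v1, v2) > similarity(v1, other))        # True
```

This shows why n-grams survive small edits better than an exact signature: a slightly modified variant still shares most of its n-grams with the original, while an unrelated binary shares almost none.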
15:35 | So that's pretty much what I want to talk about in my research. If you are interested in these areas, I would recommend the following four courses: the algorithms course, the security course, the machine learning course, and the networks course. They will be very important for your preparation. Alright, so I'll skip some of the remaining pages. I just want to let you know that if you're interested in any of these things that I'm talking about, feel free to make an appointment to come and have a talk with me. Here is my email address, and I'm sure you won't have too much trouble finding me in the department. All right, it was nice to have a chance to talk to you, and welcome to all the new students.