RUNNING THE WEB SERVER
============================================================
Extract "testing.zip" and use it as your webserver root
directory. Run your webserver using the following command
./web_server <port> <path_to_testing>/testing <num_dispatch> <num_worker> <dynamic_flag> <queue_len> <cache_entries>
**** Pick a random port other than 9000 in the range 1024-65535 to avoid collisions with other groups ****
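If you want to generate such a port automatically, the following is a minimal sketch (it assumes the shuf utility from GNU coreutils is available; re-run it if you happen to land on 9000 or on a port that is already in use):
PORT=$(shuf -i 1024-65535 -n 1)   # random port in the valid range
echo "Using port $PORT"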
For example, to run the web server on port 9000 with root directory "/home/student/joe/testing", 100 dispatch threads,
100 worker threads, and a queue length of 100, use one of the following commands depending on which features you implemented.
If you implemented neither the dynamic worker thread pool nor caching:
./web_server 9000 /home/student/joe/testing 100 100 0 100 0
If you implemented the dynamic worker thread pool but not caching:
./web_server 9000 /home/student/joe/testing 100 100 1 100 0
If you implemented caching but not the dynamic worker thread pool:
./web_server 9000 /home/student/joe/testing 100 100 0 100 100
If you implemented both the dynamic worker thread pool and caching:
./web_server 9000 /home/student/joe/testing 100 100 1 100 100
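After starting the server, you can optionally sanity-check that it is listening on the port you chose. This assumes the ss utility (from iproute2) is installed; netstat -ltn works similarly:
ss -ltn | grep :9000    # should print a LISTEN entry for your port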
You should now (from another terminal) be able to request a single file, for example:
wget http://127.0.0.1:9000/image/jpg/29.jpg
If you have a file containing all the URLs you want to fetch, you can issue the following command:
wget -i <path-to-urls>/urls
The above command asks wget to fetch every URL listed in the file named "urls".
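For reference, a "urls" file is just a plain-text list of URLs, one per line. A minimal sketch follows; the second path is a hypothetical placeholder, so use files that actually exist under your testing directory:
cat > urls <<'EOF'
http://127.0.0.1:9000/image/jpg/29.jpg
http://127.0.0.1:9000/image/jpg/30.jpg
EOF
wget -i urls    # fetch everything listed above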
To run the xargs test (refer to the project description for more details):
cat <path_to_urls_file> | xargs -n 1 -P 8 wget
This passes one URL to each wget invocation and runs up to 8 wget processes in parallel.
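If you also want a rough timing of the parallel fetch (for example, when comparing different thread-pool settings), one possible variant is to wrap the pipeline with time and discard the downloaded files:
time (cat <path_to_urls_file> | xargs -n 1 -P 8 wget -q -O /dev/null)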