comparison: on_gfs2.txt @ 13:d0e15fec6e50
separate file to gfs2 benchmark from benchmark file.

author:   taiki
date:     Tue, 13 Jan 2015 06:56:59 +0900
parents:  8cfb3d2a1f14
children: 88d663a7bad5
12:8cfb3d2a1f14 → 13:d0e15fec6e50
inserted (13:d0e15fec6e50) — summary of the IO Summary throughputs below:

VM            70.2 mb/s
FibreChannel  84.3 mb/s
docker        118.9 mb/s

deleted (12:8cfb3d2a1f14):

* 2015 1/1 GFS2 / fileserver / 60 seconds / VM to GFS2 only bldsv09 access / VM image on FC
statfile1       16217ops 270ops/s 0.0mb/s   0.2ms/op  2155us/op-cpu [0ms - 428ms]
deletefile1     16221ops 270ops/s 0.0mb/s   8.4ms/op  6842us/op-cpu [0ms - 2911ms]
closefile3      16223ops 270ops/s 0.0mb/s   0.0ms/op  2132us/op-cpu [0ms - 19ms]
readfile1       16225ops 270ops/s 34.2mb/s 23.8ms/op  4463us/op-cpu [0ms - 5577ms]
openfile2       16237ops 271ops/s 0.0mb/s   0.3ms/op  2175us/op-cpu [0ms - 428ms]
closefile2      16243ops 271ops/s 0.0mb/s   0.0ms/op  2114us/op-cpu [0ms - 5ms]
appendfilerand1 16245ops 271ops/s 2.1mb/s  49.7ms/op  8875us/op-cpu [0ms - 5643ms]
openfile1       16257ops 271ops/s 0.0mb/s   0.3ms/op  2158us/op-cpu [0ms - 425ms]
closefile1      16258ops 271ops/s 0.0mb/s   0.0ms/op  2133us/op-cpu [0ms - 9ms]
wrtfile1        16262ops 271ops/s 33.9mb/s  1.5ms/op  2783us/op-cpu [0ms - 1806ms]
createfile1     16267ops 271ops/s 0.0mb/s   0.9ms/op  2445us/op-cpu [0ms - 428ms]
1093: 127.732: IO Summary: 178655 ops, 2976.451 ops/s, (270/542 r/w), 70.2mb/s, 457us cpu/op, 28.4ms latency

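The per-configuration mb/s figures quoted in the summary at the top come from these `IO Summary` lines; the throughput field can be pulled out with plain grep, for example:

```shell
# Extract the throughput field from a filebench "IO Summary" line
echo '1093: 127.732: IO Summary: 178655 ops, 2976.451 ops/s, (270/542 r/w), 70.2mb/s, 457us cpu/op, 28.4ms latency' \
  | grep -o '[0-9.]*mb/s'
# prints: 70.2mb/s
```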
* 2015 1/1 GFS2 / fileserver / 60 seconds / only bldsv09 access / FC

statfile1       19419ops 324ops/s 0.0mb/s   0.4ms/op   1967us/op-cpu [0ms - 230ms]
deletefile1     19419ops 324ops/s 0.0mb/s   8.5ms/op   6421us/op-cpu [0ms - 8915ms]
closefile3      19429ops 324ops/s 0.0mb/s   0.0ms/op   1396us/op-cpu [0ms - 0ms]
readfile1       19429ops 324ops/s 41.9mb/s  2.9ms/op   6144us/op-cpu [0ms - 8908ms]
openfile2       19431ops 324ops/s 0.0mb/s   1.1ms/op   2121us/op-cpu [0ms - 8906ms]
closefile2      19432ops 324ops/s 0.0mb/s   0.0ms/op   1438us/op-cpu [0ms - 0ms]
appendfilerand1 19432ops 324ops/s 2.5mb/s   8.8ms/op   7417us/op-cpu [0ms - 8912ms]
openfile1       19435ops 324ops/s 0.0mb/s   3.1ms/op   2399us/op-cpu [0ms - 8907ms]
closefile1      19436ops 324ops/s 0.0mb/s   0.0ms/op   1338us/op-cpu [0ms - 0ms]
wrtfile1        19436ops 324ops/s 39.8mb/s 96.0ms/op 120915us/op-cpu [0ms - 9012ms]
createfile1     19464ops 324ops/s 0.0mb/s   9.7ms/op  13736us/op-cpu [0ms - 7886ms]
22867: 99.839: IO Summary: 213762 ops, 3562.315 ops/s, (324/648 r/w), 84.3mb/s, 1117us cpu/op, 43.5ms latency

* 2015 1/1 GFS2 / fileserver / 60 seconds / docker to only bldsv09 access / FC

Connected to /media/fcs with the -v option

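The exact docker command line is not recorded in these notes; a hypothetical invocation matching the description (bind-mounting the GFS2 mount point /media/fcs into the container with -v) might look like:

```shell
# Hypothetical invocation (not in the original notes):
# bind-mount the GFS2 mount point into a Fedora 20 container, then run filebench inside it
docker run --privileged -it -v /media/fcs:/media/fcs fedora:20 /bin/bash
```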
statfile1       27399ops 457ops/s 0.0mb/s   0.7ms/op   1869us/op-cpu [0ms - 13179ms]
deletefile1     27381ops 456ops/s 0.0mb/s   3.6ms/op   6066us/op-cpu [0ms - 13194ms]
closefile3      27400ops 457ops/s 0.0mb/s   0.0ms/op   1378us/op-cpu [0ms - 0ms]
readfile1       27400ops 457ops/s 59.2mb/s  1.5ms/op   6065us/op-cpu [0ms - 209ms]
openfile2       27400ops 457ops/s 0.0mb/s   0.3ms/op   2014us/op-cpu [0ms - 99ms]
closefile2      27400ops 457ops/s 0.0mb/s   0.0ms/op   1400us/op-cpu [0ms - 0ms]
appendfilerand1 27400ops 457ops/s 3.6mb/s   4.4ms/op   9155us/op-cpu [0ms - 13193ms]
openfile1       27400ops 457ops/s 0.0mb/s   0.4ms/op   2048us/op-cpu [0ms - 81ms]
closefile1      27400ops 457ops/s 0.0mb/s   0.0ms/op   1341us/op-cpu [0ms - 1ms]
wrtfile1        27400ops 457ops/s 56.1mb/s 81.9ms/op 130988us/op-cpu [0ms - 13500ms]
createfile1     27439ops 457ops/s 0.0mb/s   6.9ms/op  14198us/op-cpu [0ms - 13186ms]
43: 92.660: IO Summary: 301419 ops, 5023.193 ops/s, (457/913 r/w), 118.9mb/s, 1187us cpu/op, 33.2ms latency

* 2015 1/1 GFS2 / fileserver / 60 seconds / bldsv09 access / FC

statfile1       26112ops 435ops/s 0.0mb/s   0.2ms/op   1669us/op-cpu [0ms - 291ms]
deletefile1     26111ops 435ops/s 0.0mb/s   4.4ms/op   5851us/op-cpu [0ms - 14536ms]
closefile3      26114ops 435ops/s 0.0mb/s   0.0ms/op   1187us/op-cpu [0ms - 0ms]
readfile1       26114ops 435ops/s 56.1mb/s  5.0ms/op   4831us/op-cpu [0ms - 14540ms]
openfile2       26114ops 435ops/s 0.0mb/s   1.4ms/op   1825us/op-cpu [0ms - 14518ms]
closefile2      26114ops 435ops/s 0.0mb/s   0.0ms/op   1212us/op-cpu [0ms - 0ms]
appendfilerand1 26114ops 435ops/s 3.4mb/s   7.2ms/op   8338us/op-cpu [0ms - 14540ms]
openfile1       26115ops 435ops/s 0.0mb/s   1.7ms/op   1891us/op-cpu [0ms - 14519ms]
closefile1      26115ops 435ops/s 0.0mb/s   0.0ms/op   1206us/op-cpu [0ms - 0ms]
wrtfile1        26115ops 435ops/s 54.1mb/s 76.3ms/op 126556us/op-cpu [0ms - 14636ms]
createfile1     26160ops 436ops/s 0.0mb/s   8.7ms/op  13211us/op-cpu [0ms - 14537ms]
18999: 86.335: IO Summary: 287298 ops, 4787.839 ops/s, (435/870 r/w), 113.5mb/s, 1127us cpu/op, 35.0ms latency

* 2015 1/6 GFS2 / fileserver / 60 seconds / bldsv10 access / FC

statfile1       23738ops 396ops/s 0.0mb/s   1.4ms/op   2917us/op-cpu [0ms - 13948ms]
deletefile1     23718ops 395ops/s 0.0mb/s   8.4ms/op   6633us/op-cpu [0ms - 13969ms]
closefile3      23739ops 396ops/s 0.0mb/s   0.0ms/op   2433us/op-cpu [0ms - 0ms]
readfile1       23739ops 396ops/s 50.9mb/s  5.4ms/op  10531us/op-cpu [0ms - 14041ms]
openfile2       23739ops 396ops/s 0.0mb/s   3.8ms/op   2983us/op-cpu [0ms - 13955ms]
closefile2      23739ops 396ops/s 0.0mb/s   0.0ms/op   2398us/op-cpu [0ms - 0ms]
appendfilerand1 23740ops 396ops/s 3.1mb/s   7.0ms/op   9821us/op-cpu [0ms - 13969ms]
openfile1       23742ops 396ops/s 0.0mb/s   1.1ms/op   3156us/op-cpu [0ms - 13949ms]
closefile1      23742ops 396ops/s 0.0mb/s   0.0ms/op   2358us/op-cpu [0ms - 0ms]
wrtfile1        23743ops 396ops/s 49.2mb/s 56.8ms/op 114702us/op-cpu [0ms - 14031ms]
createfile1     23785ops 396ops/s 0.0mb/s  25.1ms/op  16142us/op-cpu [0ms - 13969ms]
5359: 68.172: IO Summary: 261164 ops, 4352.327 ops/s, (396/791 r/w), 103.2mb/s, 1243us cpu/op, 36.3ms latency

* 2015 1/6 GFS2 / fileserver / 60 seconds / bldsv10 and bldsv09 access / FC

Reads and writes from 2 nodes (bldsv09, bldsv10) to separate directories on GFS2, driven with ansible.
Writing to the same directory was not possible due to the nature of filebench.

ansible -s -i hosts all -a 'filebench -f /home/taira/hg/benchmarks/fileserver.f' --sudo --ask-sudo-pass

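The `hosts` inventory passed with `-i` is not included in these notes; given the two nodes named below, it was presumably something like this hypothetical fragment:

```
# Hypothetical ansible inventory ("hosts") listing the two benchmark nodes
bldsv09.cr.ie.u-ryukyu.ac.jp
bldsv10.cr.ie.u-ryukyu.ac.jp
```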
bldsv09.cr.ie.u-ryukyu.ac.jp
/media/fcs/bldsv09

statfile1       2832ops 47ops/s 0.0mb/s  15.2ms/op  5290us/op-cpu [0ms - 5189ms]
deletefile1     2830ops 47ops/s 0.0mb/s  40.4ms/op  7021us/op-cpu [0ms - 17584ms]
closefile3      2839ops 47ops/s 0.0mb/s   0.0ms/op  1828us/op-cpu [0ms - 0ms]
readfile1       2839ops 47ops/s 6.0mb/s   4.8ms/op  2853us/op-cpu [0ms - 1642ms]
openfile2       2847ops 47ops/s 0.0mb/s 120.9ms/op  7457us/op-cpu [0ms - 19056ms]
closefile2      2852ops 48ops/s 0.0mb/s   0.0ms/op  2002us/op-cpu [0ms - 0ms]
appendfilerand1 2852ops 48ops/s 0.4mb/s  40.8ms/op 14246us/op-cpu [0ms - 9765ms]
openfile1       2866ops 48ops/s 0.0mb/s 239.1ms/op 12955us/op-cpu [0ms - 19058ms]
closefile1      2867ops 48ops/s 0.0mb/s   0.0ms/op  2065us/op-cpu [0ms - 0ms]
wrtfile1        2867ops 48ops/s 5.9mb/s  31.5ms/op 12675us/op-cpu [0ms - 5644ms]
createfile1     2879ops 48ops/s 0.0mb/s 452.5ms/op 76766us/op-cpu [0ms - 28904ms]
4400: 65.803: IO Summary: 31370 ops, 522.771 ops/s, (47/95 r/w), 12.2mb/s, 1526us cpu/op, 316.6ms latency

bldsv10.cr.ie.u-ryukyu.ac.jp
/media/fcs/bldsv10

statfile1       4912ops 82ops/s 0.0mb/s   11.9ms/op  2079us/op-cpu [0ms - 4671ms]
deletefile1     4891ops 82ops/s 0.0mb/s   71.6ms/op 12556us/op-cpu [0ms - 13723ms]
closefile3      4919ops 82ops/s 0.0mb/s    0.0ms/op  1386us/op-cpu [0ms - 0ms]
readfile1       4919ops 82ops/s 10.3mb/s   3.1ms/op  2007us/op-cpu [0ms - 6064ms]
openfile2       4922ops 82ops/s 0.0mb/s   16.0ms/op  2292us/op-cpu [0ms - 4709ms]
closefile2      4926ops 82ops/s 0.0mb/s    0.0ms/op  1303us/op-cpu [0ms - 0ms]
appendfilerand1 4926ops 82ops/s 0.6mb/s   28.7ms/op 11045us/op-cpu [0ms - 7014ms]
openfile1       4933ops 82ops/s 0.0mb/s   93.4ms/op  2783us/op-cpu [0ms - 15341ms]
closefile1      4936ops 82ops/s 0.0mb/s    0.0ms/op  1264us/op-cpu [0ms - 0ms]
wrtfile1        4936ops 82ops/s 10.2mb/s  99.5ms/op 75415us/op-cpu [0ms - 10896ms]
createfile1     4938ops 82ops/s 0.0mb/s  220.4ms/op 37436us/op-cpu [0ms - 16501ms]
18603: 65.666: IO Summary: 54158 ops, 902.538 ops/s, (82/164 r/w), 21.1mb/s, 1473us cpu/op, 181.6ms latency

* 2015 1/12 GFS2 / fileserver / 60 seconds / bldsv10 access / FC

Following http://jpetazzo.github.io/2014/01/29/docker-device-mapper-resize/, moved the pool that devicemapper creates onto FC:

1. Stop the Docker daemon.
2. Wipe out /var/lib/docker. (That should sound familiar, right?)
3. Create the storage directory: mkdir -p /var/lib/docker/devicemapper/devicemapper.
4. Create a data symbolic link in that directory, pointing to the device: ln -s /dev/sdb /var/lib/docker/devicemapper/devicemapper/data.
5. Restart Docker.
6. Check with docker info that the Data Space Total value is correct.

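The six steps above as a shell sketch, assuming the FC device appears as /dev/sdb (as in the symlink step), systemd manages the daemon, and the commands run as root:

```shell
# Sketch of the steps above; destructive: wipes all existing Docker state.
systemctl stop docker                                          # 1. stop the Docker daemon
rm -rf /var/lib/docker                                         # 2. wipe out /var/lib/docker
mkdir -p /var/lib/docker/devicemapper/devicemapper             # 3. create the storage directory
ln -s /dev/sdb /var/lib/docker/devicemapper/devicemapper/data  # 4. point "data" at the FC device
systemctl start docker                                         # 5. restart Docker
docker info | grep 'Data Space Total'                          # 6. verify the pool size
```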
Ran docker with docker run --privileged -it fedora:20 /bin/bash and installed filebench inside the container

statfile1       41866ops 698ops/s 0.0mb/s  0.0ms/op  6235us/op-cpu [0ms - 1ms]
deletefile1     41869ops 698ops/s 0.0mb/s  0.2ms/op  6971us/op-cpu [0ms - 218ms]
closefile3      41874ops 698ops/s 0.0mb/s  0.0ms/op  6054us/op-cpu [0ms - 2ms]
readfile1       41881ops 698ops/s 91.0mb/s 8.8ms/op 29662us/op-cpu [0ms - 266ms]
openfile2       41886ops 698ops/s 0.0mb/s  0.1ms/op  6587us/op-cpu [0ms - 13ms]
closefile2      41890ops 698ops/s 0.0mb/s  0.0ms/op  6085us/op-cpu [0ms - 2ms]
appendfilerand1 41895ops 698ops/s 5.4mb/s  2.5ms/op 12903us/op-cpu [0ms - 14ms]
openfile1       41900ops 698ops/s 0.0mb/s  0.1ms/op  6640us/op-cpu [0ms - 9ms]
closefile1      41904ops 698ops/s 0.0mb/s  0.0ms/op  6122us/op-cpu [0ms - 2ms]
wrtfile1        41910ops 698ops/s 86.5mb/s 8.5ms/op 28960us/op-cpu [0ms - 272ms]
createfile1     41914ops 699ops/s 0.0mb/s  0.2ms/op  7029us/op-cpu [0ms - 15ms]
35: 76.024: IO Summary: 460789 ops, 7679.141 ops/s, (698/1397 r/w), 182.9mb/s, 1022us cpu/op, 6.8ms latency