Introduction  1 (1)
The Testing Methodology in Context  6 (1)
Establishing Performance Criteria  12 (1)
Simulating Application Usage  13 (8)
Using the Real Think Time  18 (2)
Average Response Time (ART)  25 (1)
Aggregate Average Response Time  26 (1)
Maximum Average Response Time  26 (1)
Assessing the Accuracy of Test Results  28 (1)
Test Environment Optimizations  31 (1)
Single Instance Stress Tests  31 (1)
Where to Obtain the Grinder  36 (1)
An Overview of the Grinder  36 (6)
Using the Grinder Console  45 (6)
The Console Recording Model  51 (1)
Modeling a Web Browser Session  53 (1)
Miscellaneous HTTP Plug-in Properties  58 (1)
Writing a Grinder Plug-in  60 (12)
JMS Plug-in Per-test Properties  62 (1)
The JMS Queue Sender Plug-in  66 (5)
Properties That Control Timing  73 (1)
Using the TCP Sniffer to Create Test Scripts  75 (8)
The TCP Sniffer as a Debugging Tool  81 (1)
Installing the TCP Sniffer as a Browser Proxy  81 (2)
Use a Shared Disk for grinder.properties  83 (1)
The History and Future of the Grinder  83 (2)
Choosing a JVM for the Java Pet Store  88 (6)
Defining Performance Metrics  98 (1)
The Registered Customer Test Script  105 (3)
The New Customer Test Script  108 (2)
Selecting the Sampling Method  111 (1)
Setting up and Running the Tests  112 (1)
Setting up the Database Schema  112 (1)
Configuring and Starting WebLogic Server  113 (2)
A Note on WebLogic Execute Threads  122 (3)
Exploring the Limit Case  123
JDK 1.3.1-b24 HotSpot Client  125 (3)
JDK 1.3.1-b24 HotSpot Server  128 (2)
Single Instance Stress Test  131 (1)
100 Users (Baseline Case)  131 (2)
Beyond the Limit - 440 Users  138 (2)
Using JDK 1.3.1-b24 HotSpot Server  142 (1)
Optimizing the Number of Execute Threads  143 (1)
Improving the Performance of E-Pizza  144 (1)
Using SQL_TRACE and TKPROF with E-Pizza  146 (1)
Parsing and Executing Statements  148 (1)
The Commit of the Pizza Order  149 (3)
Other Database-Related Tuning Suggestions  152 (1)
Application Code and Architecture Refactoring  152 (1)
Consistent Use of the Session Facade Pattern  152 (1)
Performance Tests Using a WebLogic Cluster  154 (9)
Designing Servlets for Performance  167 (4)
Using the init() Method for Caching  167 (1)
Choosing a Session Mechanism  168 (1)
Managing the Servlet Thread Pool  169 (1)
Disabling Automatic Reloading of Servlets  170 (1)
Test Servlets and Test Scripts  171 (3)
Defining Performance Metrics  172 (1)
Throughput: Transactions Per Second (TPS)  173 (1)
Selecting the Sampling Method  173 (1)
Setting Up and Running the Tests  175 (6)
Configuring and Starting WebLogic Server  175 (2)
The RandomBytesServlet Test Servlet  181 (4)
An Initial Test with Think Times  185 (4)
The Baseline Case - No Think Time  189 (2)
Test Environment Optimizations  191 (1)
WebLogic Server Execute Threads  194 (1)
WebLogic Performance Pack  195 (1)
The Cost of Maintaining an HTTP Log  197 (2)
The Effect of the Response Size  199 (2)
Different HTTP Protocol Options  201 (3)
Using the Servlet Compression Filter  204 (1)
The Effect of Session Object Size  218 (1)
HTTP Session Object vs Stateful EJB  219 (1)
Setting up and Running the Tests  230 (1)
Setting up the Database Schema  230 (1)
Configuring and Starting WebLogic Server  230 (2)
Selecting the Replication Technique  232 (1)
Comparing File and Database Persistence  237 (3)
EJB Design Pattern Sources  247 (1)
Three Fundamental Patterns  248 (1)
Why Evaluate EJB Design Patterns and Performance?  249 (1)
Better EJB Design through Testing  250 (1)
Running the DispatcherServlet Manually  254 (4)
Setting up and Running the Tests  260 (1)
Testing the Session Facade Pattern  261 (10)
The Session Facade Pattern in a Nutshell  261 (2)
Test Case 5.1.1: FacadeOffTest  263 (1)
Test Case 5.1.2: FacadeOnTest  266 (1)
Local or Remote Interfaces?  267 (2)
Using the Data Transfer HashMap Pattern  269 (1)
Testing the Value Object Pattern  271 (22)
The Value Object Pattern in a Nutshell  272 (2)
Test Case 5.2.1: VoOffTest  274 (1)
Test Case 5.2.2: VoOnTest  277 (1)
More Interesting Value Object Tests  280 (1)
Test Case 5.2.3: VoTxTest  281 (2)
Test Case 5.2.4: VoFacadeTest  288 (1)
Testing the Data Access Object Pattern  293 (20)
The Data Access Object Pattern in a Nutshell  295 (2)
Test Case 5.3.1: DaoOffTest  297 (1)
Test Case 5.3.2: DaoOnTest  300 (1)
More Interesting Data Access Object Tests  303 (1)
Test Case 5.3.3: DaoBmpTest  304 (3)
Test Case 5.3.4: DaoCmrTest  308 (1)
Test Case 5.3.5: DaoSbTest  310 (2)
EJB Deployment Configuration Testing Strategy  313 (21)
Testing EJB Deployment Alternatives  318 (2)
The DeploymentTest Script  320 (1)
Transaction Isolation and EJB Performance  321 (2)
Test Case 5.4.1: DdIsoTSTest  323 (1)
Test Case 5.4.2: DdIsoTRCTest  324 (1)
Access Control and EJB Performance  327 (1)
Test Case 5.5.1: DdSecOffTest  327 (1)
Test Case 5.5.2: DdSecOnTest  330 (1)
JMS Point-to-Point Messaging  337 (58)
An Overview of JMS Point-to-Point Messaging  338 (3)
JMS PTP Performance Issues  342 (2)
The Grinder Plug-Ins and Test Scripts  345 (3)
Setting up and Running the Tests  348 (8)
Setting up the Persistent Store  348 (1)
Configuring and Starting WebLogic Server  349 (2)
Setting the Type of Persistence  351 (2)
Collecting and Analyzing the Data  355 (1)
The Baseline Case (16 KB Messages, 100 Consumers)  357 (1)
Test Environment Optimizations  358 (1)
Tests with 16 KB Messages  360 (1)
Investigating Acknowledgment  360 (6)
Using Database Persistence  371 (4)
Investigating Acknowledgment  376 (4)
Using Database Persistence  383 (2)
Overall Conclusions for JMS Fan-Out  385 (3)
The Effect of Message Size  385 (1)
Test Case 6.19 - 2 Queues, 16 KB Messages  388 (2)
Test Case 6.20 - 2 Queues, 1 KB Messages  390 (1)
Test Case 6.21 - 32 Queues, 16 KB Messages  391 (1)
Test Case 6.10 - 32 Queues, 1 KB Messages  391 (1)
JMS Publish/Subscribe Messaging  395 (34)
An Overview of Pub/Sub Messaging  396 (3)
The Grinder Plug-Ins and Test Scripts  400 (6)
Publisher Plug-in Parameters  402 (1)
Subscriber Plug-in Parameters  402 (2)
The Stock Ticker Application  406 (9)
Test Environment Optimizations  408 (3)
No Acknowledgment versus Multicast  411 (4)
The Airline Seating Example  416 (11)
Investigating Acknowledgment  417 (3)
File Persistence with Synchronous Writes  420 (2)
File Persistence with Non-Synchronous Writes  422 (1)
Appendix A: The Grinder Reference  429 (12)
Appendix B: Hardware and Software  441 (4)
Appendix C: Comparing ECperf and the Testing Methodology  445 (2)
Definition and Architecture  447 (3)