J2EE Performance Testing Using BEA WebLogic Server

by Peter Zadrozny
Format: Paperback
Pub. Date: 2002-06-01
Publisher(s): Springer-Verlag New York Inc
List Price: $49.99

Summary

Explains how to evaluate the performance of complete J2EE applications, explores the performance issues of popular J2EE APIs, provides a performance-testing methodology, and offers a benchmark for judging J2EE performance.

Author Biography

Peter Zadrozny is the Chief Technologist for BEA Systems in Europe, the Middle East, and Africa, and an expert in the field of J2EE performance testing. He spends much of his time meeting with prospects and customers, helping them design and fine-tune WebLogic Server-based solutions to meet their business challenges. Drawing on this wealth of experience and customer feedback, Peter presents a powerful toolset and methodology that allow developers to performance-test their J2EE applications.

Table of Contents

Introduction
1(1)
J2EE Performance Testing
2(6)
What is Performance?
3(1)
Interactive Applications
4(1)
Back-end Applications
5(1)
The Testing Methodology in Context
6(1)
Benchmarking
6(1)
Profiling
7(1)
Tuning
8(1)
How to Use This Book
8(1)
The Book Organization
9(2)
The Testing Methodology
11(24)
Methodology Overview
12(1)
Establishing Performance Criteria
12(1)
Simulating Application Usage
13(8)
Defining Test Scripts
13(1)
Usage Profiles
14(2)
Test Data
16(1)
Realistic Usage Patterns
16(1)
Think Time
17(1)
Using the Real Think Time
18(2)
Using Zero Think Time
20(1)
Sampling Methods
21(2)
The Cycle Method
21(1)
The Snapshot Method
22(1)
Exclusion of Data
22(1)
Performance Statistics
23(7)
Response Time
23(2)
Average Response Time (ART)
25(1)
Aggregate Average Response Time
26(1)
Maximum Average Response Time
26(1)
Throughput
27(1)
Assessing the Accuracy of Test Results
28(1)
Quality of a Sample
28(2)
The Performance Tests
30(2)
Preliminary Tests
30(1)
The Baseline Case
30(1)
Test Environment Optimizations
31(1)
Single Instance Stress Tests
31(1)
Endurance Tests
31(1)
Architectural Tests
32(1)
Summary
32(3)
The Grinder
35(52)
Where to Obtain the Grinder
36(1)
An Overview of the Grinder
36(6)
Load Generation
37(1)
Test Definition
38(2)
Statistics Recording
40(1)
The Console
40(2)
Getting started
42(3)
Using the Grinder Console
45(6)
Communication Settings
45(1)
Starting the Console
46(2)
A First Run
48(3)
The Console Recording Model
51(1)
Using the HTTP Plug-in
51(9)
Defining HTTP Requests
51(1)
Checking the Response
52(1)
Modeling a Web Browser Session
53(1)
Cookies and Sessions
53(1)
String Beans
54(2)
Advanced String Beans
56(2)
Miscellaneous HTTP Plug-in Properties
58(1)
HTTPClient
58(1)
HTTPS
59(1)
Writing a Grinder Plug-in
60(12)
Designing Your Plug-in
61(1)
JMS Plug-in Properties
62(1)
JMS Plug-in Per-test Properties
62(1)
The Grinder Plug-in SPI
63(1)
The Two Key Interfaces
63(2)
The Context Objects
65(1)
The Test Interface
65(1)
The JMS Queue Sender Plug-in
66(5)
Additional Statistics
71(1)
Timing
72(3)
Properties That Control Timing
73(1)
Timing Issues
74(1)
Using the TCP Sniffer to Create Test Scripts
75(8)
Running the TCP Sniffer
76(1)
The Sniffer in Action
76(5)
The TCP Sniffer as a Debugging Tool
81(1)
Installing the TCP Sniffer as a Browser Proxy
81(2)
Hints and Tips
83(1)
Use a Shared Disk for grinder.properties
83(1)
More Than One Test Case
83(1)
Reducing Network Usage
83(1)
The History and Future of the Grinder
83(2)
The Grinder 3
84(1)
Summary
85(2)
Application Case Studies
87(78)
Choosing a JVM for the Java Pet Store
88(6)
JVM Tuning
94(1)
The e-Pizza Application
94(60)
Application Architecture
95(2)
Applying the Methodology
97(1)
Defining Performance Metrics
98(1)
Usage Profiles
98(1)
The Grinder String Bean
99(3)
The getPhone Method
102(1)
The getNewPhone Method
102(1)
Test Scripts
102(3)
The Registered Customer Test Script
105(3)
The New Customer Test Script
108(2)
The Test Environment
110(1)
Selecting the Sampling Method
111(1)
Setting up and Running the Tests
112(1)
Setting up the Database Schema
112(1)
Configuring and Starting WebLogic Server
113(2)
Setting up the Grinder
115(3)
Starting the Test
118(1)
Stopping the Test
119(1)
Analyzing the Results
119(2)
Preliminary Tests
121(1)
A Note on WebLogic Execute Threads
122(3)
Exploring the Limit Case
123
The Baseline Case
124
The Test Plan
125(1)
Selecting a JVM
125(1)
JDK 1.3.1-b24 HotSpot Client
125(3)
JDK 1.3.1-b24 HotSpot Server
128(2)
JDK 1.3.1-b24 Classic
130(1)
Making the Choice
131(1)
Single Instance Stress Test
131(1)
100 Users (Baseline Case)
131(2)
200 Users
133(1)
320 Users
134(1)
400 Users
135(3)
Beyond the Limit - 440 Users
138(2)
Endurance Test
140(2)
Using JDK 1.3.1-b24 HotSpot Server
142(1)
Optimizing the Number of Execute Threads
143(1)
Improving the Performance of e-Pizza
144(1)
SQL Tuning
144(1)
Timed_Statistics
145(1)
Using SQL_TRACE and TKPROF with e-Pizza
146(1)
Use of Bind Variables
147(1)
Parsing and Executing Statements
148(1)
The Commit of the Pizza Order
149(3)
Other Database-Related Tuning Suggestions
152(1)
Application Code and Architecture Refactoring
152(1)
Consistent Use of the Session Facade Pattern
152(1)
Use CMP Entity EJBs
153(1)
OID Generation
154(1)
Performance Tests Using a WebLogic Cluster
154(9)
Cluster of 2 - 100 Users
156(1)
Cluster of 2 - 400 Users
156(2)
Cluster of 2 - 600 Users
158(2)
Cluster of 3 - 400 Users
160(1)
Cluster of 3 - 600 Users
160(3)
Conclusion
163(2)
HTTP and Servlets
165(80)
A Brief Overview
166(1)
Designing Servlets for Performance
167(4)
Using the init() Method for Caching
167(1)
Choosing a Session Mechanism
168(1)
Managing the Servlet Thread Pool
169(1)
Closing Resources
170(1)
Disabling Automatic Reloading of Servlets
170(1)
Test Servlets and Test Scripts
171(3)
Defining Performance Metrics
172(1)
AART
173(1)
Throughput: Transactions Per Second (TPS)
173(1)
Selecting the Sampling Method
173(1)
The Test Environment
174(1)
Setting Up and Running the Tests
175(6)
Configuring and Starting WebLogic Server
175(2)
Setting Up the Grinder
177(1)
Collecting the Data
178(1)
Analyzing the Data
179(2)
Preliminary Tests
181(15)
The RandomBytesServlet Test Servlet
181(4)
An Initial Test with Think Times
185(4)
The Baseline Case - No Think Time
189(2)
Test Environment Optimizations
191(1)
TCP/IP Tuning
191(1)
JVMs and Heap Space
192(2)
WebLogic Server Execute Threads
194(1)
WebLogic Performance Pack
195(1)
The Test Plan
196(31)
The Cost of Maintaining an HTTP Log
197(2)
The Effect of the Response Size
199(2)
Different HTTP Protocol Options
201(3)
Using the Servlet Compression Filter
204(1)
GZIPServletFilter
204(2)
Running the Tests
206(2)
Managing Session State
208(1)
HTTP Session Objects
209(1)
AlterSessionServlet
209(6)
2KB HTTP Session Object
215(2)
5KB HTTP Session Object
217(1)
The Effect of Session Object Size
218(1)
HTTP Session Object vs Stateful EJB
219(1)
The Session Servlet
219(4)
Running the Tests
223(2)
Using ejbRemove
225(2)
Conclusion
227(1)
Servlet Clustering
227(14)
Replication
228(1)
The Test Environment
229(1)
Setting up and Running the Tests
230(1)
Setting up the Database Schema
230(1)
Configuring and Starting WebLogic Server
230(2)
Selecting the Replication Technique
232(1)
Single Object Case
233(1)
No Replication
234(1)
In-Memory Replication
235(2)
Comparing File and Database Persistence
237(3)
Cookie Replication
240(1)
Multiple Objects Case
241(1)
Summary
241(4)
EJB Design Patterns
245(92)
Design Patterns
246(4)
EJB Design Patterns
246(1)
EJB Design Pattern Sources
247(1)
Three Fundamental Patterns
248(1)
Why Evaluate EJB Design Patterns and Performance?
249(1)
Better EJB Design through Testing
250(1)
Application Scenario
250(8)
Test Harness Design
251(1)
EJB Test Cases
252(2)
Running the DispatcherServlet Manually
254(4)
Test Scripts
258(1)
Test Configuration
259(1)
Setting up and Running the Tests
260(1)
Testing the Session Facade Pattern
261(10)
The Session Facade Pattern in a Nutshell
261(2)
Test Case 5.1.1: FacadeOffTest
263(1)
Test Results Summary
264(2)
Test Case 5.1.2: FacadeOnTest
266(1)
Local or Remote Interfaces?
267(2)
Using the Data Transfer HashMap Pattern
269(1)
Test Results Summary
270(1)
Testing the Value Object Pattern
271(22)
The Value Object Pattern in a Nutshell
272(2)
Test Case 5.2.1: VoOffTest
274(1)
Test Results Summary
275(2)
Test Case 5.2.2: VoOnTest
277(1)
Test Results Summary
278(2)
More Interesting Value Object Tests
280(1)
Test Case 5.2.3: VoTxTest
281(2)
Test Results Summary
283(5)
Test Case 5.2.4: VoFacadeTest
288(1)
Test Results Summary
289(4)
Testing the Data Access Object Pattern
293(20)
The Data Access Object Pattern in a Nutshell
295(2)
Test Case 5.3.1: DaoOffTest
297(1)
Test Results Summary
298(2)
Test Case 5.3.2: DaoOnTest
300(1)
Test Results Summary
301(2)
More Interesting Data Access Object Tests
303(1)
Test Case 5.3.3: DaoBmpTest
304(3)
Test Results Summary
307(1)
Test Case 5.3.4: DaoCmrTest
308(1)
Test Results Summary
309(1)
Test Case 5.3.5: DaoSbTest
310(2)
Test Results Summary
312(1)
EJB Deployment Configuration Testing Strategy
313(21)
Testing EJB Deployment Alternatives
318(2)
The DeploymentTest Script
320(1)
Transaction Isolation and EJB Performance
321(2)
Test Case 5.4.1: DdIsoTSTest
323(1)
Test Results Summary
323(1)
Test Case 5.4.2: DdIsoTRCTest
324(1)
Test Results Summary
325(2)
Access Control and EJB Performance
327(1)
Test Case 5.5.1: DdSecOffTest
327(1)
Test Results Summary
328(2)
Test Case 5.5.2: DdSecOnTest
330(1)
Test Results Summary
331(3)
Summary
334(3)
JMS Point-to-Point Messaging
337(58)
An Overview of JMS Point-to-Point Messaging
338(3)
JMS PTP Architectures
339(2)
Performance Metrics
341(1)
JMS PTP Performance Issues
342(2)
Acknowledgment Modes
343(1)
Persistence
343(1)
The Test Configuration
344(1)
Sampling Metrics
345(1)
The Grinder Plug-Ins and Test Scripts
345(3)
Producer Test Script
346(1)
Consumer Test Script
347(1)
Setting up and Running the Tests
348(8)
Setting up the Persistent Store
348(1)
Configuring and Starting WebLogic Server
349(2)
Setting the Type of Persistence
351(2)
Setting up the Grinders
353(2)
Collecting and Analyzing the Data
355(1)
Fan-out Tests
356(29)
Preliminary Tests
356(1)
The Baseline Case (16KB Messages, 100 Consumers)
357(1)
Test Environment Optimizations
358(1)
The Test Plan
359(1)
Tests with 16 KB Messages
360(1)
Investigating Acknowledgment
360(6)
Using File Persistence
366(5)
Using Database Persistence
371(4)
Tests with 1KB Messages
375(1)
Investigating Acknowledgment
376(4)
Using File Persistence
380(3)
Using Database Persistence
383(2)
Overall Conclusions for JMS Fan-out
385(3)
The Effect of Message Size
385(1)
Acknowledgment Modes
385(1)
Persistence
386(2)
Multiple Queues
388(4)
Test Case 6.19 - 2 Queues, 16KB Messages
388(2)
Test Case 6.20 - 2 Queues, 1KB Messages
390(1)
Test Case 6.21 - 32 Queues, 16KB Messages
391(1)
Test Case 6.22 - 32 Queues, 1KB Messages
391(1)
Summary of Results
392(1)
Conclusions
392(3)
JMS Publish/Subscribe Messaging
395(34)
An Overview of Pub/Sub Messaging
396(3)
Message Delivery
397(1)
Pub/Sub via Multicast
398(1)
Application Architecture
399(1)
The Grinder Plug-Ins and Test Scripts
400(6)
Publisher Plug-in Parameters
402(1)
Subscriber Plug-in Parameters
402(2)
Publisher Test Script
404(1)
Subscriber Test Script
405(1)
The Stock Ticker Application
406(9)
Testing Metrics
406(1)
Preliminary Tests
407(1)
Test Environment Optimizations
408(3)
The Test Plan
411(1)
No Acknowledgment versus Multicast
411(4)
Conclusions
415(1)
The Airline Seating Example
416(11)
Testing Metrics
416(1)
The Test Plan
417(1)
Investigating Acknowledgment
417(3)
File Persistence with Synchronous Writes
420(2)
File Persistence with Non-Synchronous Writes
422(1)
Auto Mode
422(1)
Client Mode
423(1)
Summary of Results
424(1)
Database Persistence
425(1)
Auto Mode
425(1)
Conclusions
426(1)
Summary
427(2)
Appendix A: The Grinder Reference
429(12)
Standard properties
429(4)
HTTP Plug-in properties
433(2)
Plug-in properties
433(1)
Test properties
434(1)
The Console Controls
435(6)
File Menu
435(1)
Action Menu
436(1)
Help Menu
436(1)
Options Dialog
437(4)
Appendix B: Hardware and Software
441(4)
Hardware
441(1)
Software
442(3)
Appendix C: Comparing ECperf and the Testing Methodology
445(2)
Definition and Architecture
447(3)
Process
447(2)
Strengths and Weaknesses
449
