----------------------------------------------------------------------------
The Florida SunFlash

       Book Review: Benchmark Handbook For Database And TP Systems

SunFLASH Vol 34 #15                                           September 1991
----------------------------------------------------------------------------
This is from an article posted to the USENET newsgroup comp.databases. -johnj
----------------------------------------------------------------------------
From: morgan@unix.SRI.COM (Morgan Kaufmann)
Subject: Announcing new benchmarking book
Organization: SRI International, Menlo Park, CA

(The Morgan Kaufmann Series in Data Management Systems, Jim Gray, Series
Editor)

THE BENCHMARK HANDBOOK FOR DATABASE AND TRANSACTION PROCESSING SYSTEMS,
Edited by Jim Gray (Digital Equipment Corporation)

With Omri Serlin, David DeWitt, Carolyn Turbyfill, Cyril Orji, Dina Bitton,
R.G.G. Cattell, Patrick E. O'Neil, Tom Sawyer, Neal Nelson

April 1991; 334 pages; cloth; ISBN 1-55860-159-7; $49.95

"Jim Gray has done a great job of pulling together some of the key players
in this arena... The most obviously useful feature of the book is its
inclusion of the full text and background explanation for the standard
TPC-A and TPC-B benchmarks... I fully expect this book to become the
standard reference in its field."
                                               -- C. J. Date

"Jim Gray's comments on benchmarking and benchmark wars and Tom Sawyer's
dry wit add immensely to the pleasure of reading this book... Omri Serlin's
historical perspective should become a classic in this new field."
                                               -- Bill Highleyman
                                                  (The Sombers Group)

This practical handbook offers the reader a comprehensive view of
benchmarking for modern transaction processing and database systems. Much
of this information is available to readers for the first time. Each of the
benchmarks presented is simple, scaleable, and has been successfully ported
to several computer systems. Most importantly, the benchmarks can be used
to measure entire systems, from processor to network.

Each benchmark is presented by a contributor who is intimately familiar
with it. After the introduction by Jim Gray, each chapter provides a
tutorial on the benchmark, its rationale, and its intended application
area. The authors also describe typical pitfalls and performance reports
obtained from using the benchmark. The volume concludes with advice from an
experienced benchmark auditor, who offers tools and a checklist of things
to do when running a benchmark.
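For readers who have not seen it, the DebitCredit transaction at the heart
of TPC-A and TPC-B has a famously simple profile: debit or credit an
account, apply the same delta to the owning teller and branch totals, and
append a record to a history table, all as one atomic transaction. The
sketch below (Python with SQLite; the table and column names are
illustrative assumptions, not the official TPC schema) shows the shape of
that transaction; the authoritative specification text appears in full in
the book.

    import sqlite3

    # Toy versions of the four DebitCredit tables. A real run would
    # populate them per the spec's scaling rules (e.g., 100,000 accounts
    # per branch) rather than with a single row each.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE accounts (aid INTEGER PRIMARY KEY, bid INTEGER,
                               balance INTEGER);
        CREATE TABLE tellers  (tid INTEGER PRIMARY KEY, bid INTEGER,
                               balance INTEGER);
        CREATE TABLE branches (bid INTEGER PRIMARY KEY, balance INTEGER);
        CREATE TABLE history  (tid INTEGER, bid INTEGER, aid INTEGER,
                               delta INTEGER,
                               tstamp TEXT DEFAULT CURRENT_TIMESTAMP);
        INSERT INTO branches VALUES (1, 0);
        INSERT INTO tellers  VALUES (1, 1, 0);
        INSERT INTO accounts VALUES (1, 1, 0);
    """)

    def debit_credit(aid, tid, bid, delta):
        """One DebitCredit-style transaction: update the account, its
        teller and branch totals, and log the event in history."""
        with conn:  # atomic: commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance + ? "
                         "WHERE aid = ?", (delta, aid))
            conn.execute("UPDATE tellers SET balance = balance + ? "
                         "WHERE tid = ?", (delta, tid))
            conn.execute("UPDATE branches SET balance = balance + ? "
                         "WHERE bid = ?", (delta, bid))
            conn.execute("INSERT INTO history (tid, bid, aid, delta) "
                         "VALUES (?, ?, ?, ?)", (tid, bid, aid, delta))
            return conn.execute("SELECT balance FROM accounts "
                                "WHERE aid = ?", (aid,)).fetchone()[0]

    print(debit_credit(aid=1, tid=1, bid=1, delta=100))  # prints 100

A benchmark run then drives many such transactions concurrently and reports
sustained throughput (transactions per second) and price/performance, which
is what the TPC-A and TPC-B specifications reproduced in the book pin down
precisely.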
                             TABLE OF CONTENTS

Chapter 1  Introduction...........................................1
           Jim Gray
  1.1  The Need for Domain Specific Benchmarks....................3
  1.2  Standards Bodies Defining Benchmarks.......................3
  1.3  The Key Criteria for a Domain Specific Benchmark...........4
  1.4  Domain-Specific Benchmarks for Database and
       Transaction Processing.....................................6
  1.5  A Historical Perspective on DB and TP Benchmarks...........7
  1.6  An Overview of the Benchmarks in this Handbook.............8
  1.7  How To Use This Handbook..................................11
  1.8  How To Use These Benchmarks...............................12
  1.9  Further Reading...........................................13
  1.10 Future Database and Performance Benchmarks................15
  References.....................................................16

Chapter 2  The History of DebitCredit and the TPC................19
           Omri Serlin
  2.1  Interest in OLTP Performance..............................19
  2.2  Anon et al. Publishes a Paper.............................21
  2.3  Key Characteristics of a DebitCredit......................22
  2.4  Early Efforts to Garner Industry Consensus................23
  2.5  TP1 Variation Muddies the Waters..........................25
  2.6  The Sawyer-Serlin Paper...................................26
  2.7  The TPC Gets Launched.....................................27
  2.8  TPC-A Official Results....................................32
  2.9  TPC-B.....................................................35
  2.10 Future Work...............................................35
  2.11 Assessment................................................36
  2.12 Concluding Caveat.........................................37
  References.....................................................38

TPC-A............................................................39
  Clause 0:  Preamble............................................42
  Clause 1:  Transaction and Terminal Profiles...................43
  Clause 2:  Transaction System Properties.......................45
  Clause 3:  Logical Database Design.............................51
  Clause 4:  Scaling Rules.......................................53
  Clause 5:  Distribution, Partitioning, & Message Generation....54
  Clause 6:  Response Time.......................................56
  Clause 7:  Duration of Test....................................59
  Clause 8:  SUT, Driver & Communications Definition.............60
  Clause 9:  Pricing.............................................66
  Clause 10: Full Disclosure.....................................70
  Clause 11: Audit...............................................73
  Appendix A: Sample Implementation..............................75

TPC-B............................................................79
  Clause 0:  Preamble............................................83
  Clause 1:  Transaction Profile.................................84
  Clause 2:  Transaction System Properties.......................85
  Clause 3:  Logical Database Design.............................92
  Clause 4:  Scaling Rules.......................................94
  Clause 5:  Distribution, Partitioning, & Transaction
             Generation..........................................95
  Clause 6:  Residence Time......................................97
  Clause 7:  Duration of Test...................................101
  Clause 8:  SUT Driver Definition..............................102
  Clause 9:  Pricing............................................105
  Clause 10: Full Disclosure....................................108
  Clause 11: Audit..............................................112
  Appendix A: Sample Implementation.............................114

Chapter 3  The Wisconsin Benchmark: Past, Present, & Future.....119
           David J. DeWitt
  3.1  Introduction.............................................119
  3.2  An Overview of the Wisconsin Benchmark...................122
       3.2.1  The Wisconsin Benchmark Relations.................122
       3.2.2  The Wisconsin Benchmark Query Suite...............122
  3.3  Benchmarking Parallel Database Systems Using the
       Wisconsin Benchmark......................................139
       3.3.1  Speedup and Scaleup: Two Key Metrics for a
              Parallel Database System..........................139
       3.3.2  Using the Wisconsin Benchmark to Measure
              Speedup, Scaleup, and Sizeup......................142
       3.3.3  Sizeup, Speedup, and Scaleup Experiments on
              the Gamma Prototype...............................143
  3.4  Conclusions..............................................157
  References....................................................159
  Appendix 1.  Wisconsin Benchmark Queries for 10,000
               Tuple Relations..................................162
Chapter 4  AS3AP: An ANSI SQL Standard Scaleable and Portable
           Benchmark for Relational Database Systems............167
           Carolyn Turbyfill, Cyril Orji, and Dina Bitton
  4.1  Historical Perspective...................................167
  4.2  A Scaleable Benchmark....................................169
  4.3  Benchmark Scope..........................................170
  4.4  Systems Under Test.......................................171
  4.5  Measurements.............................................172
  4.6  Test Database............................................173
       4.6.1  Database Generator................................173
       4.6.2  Structure of the Database.........................174
  4.7  Single User Tests........................................180
       4.7.1  Operational Issues................................181
       4.7.2  Test Queries......................................182
  4.8  Multiuser Tests..........................................187
  4.9  Summary..................................................190
  4.10 Acknowledgements.........................................191
  References....................................................191
  Appendix 1.  SQL Schema.......................................193
  Appendix 2.  Single-user Tests................................196
  Appendix 3.  Run Sequence For Single-User Tests...............203
  Appendix 4.  Multiuser Tests..................................206

Chapter 5  The Set Query Benchmark..............................209
           Patrick O'Neil
  5.1  Introduction to the Benchmark............................209
       5.1.1  Features of the Set Query Benchmark...............211
       5.1.2  Achieving Functional Coverage.....................213
       5.1.3  Running the Benchmark.............................218
  5.2  An Application of the Benchmark..........................220
       5.2.1  Hardware/Software Environment.....................220
       5.2.2  Statistics Gathered...............................221
       5.2.3  Rating DB2 and MODEL 204 in $/QPM.................233
  5.3  How to Run the Set Query Benchmark.......................234
       5.3.1  Generating the Data...............................234
       5.3.2  Running the Queries: Buffer Flushing..............236
       5.3.3  Interpreting the Results..........................238
  5.4  Configuration and Reporting Requirements, a Checklist....240
  5.5  Concluding Remarks.......................................241
  Acknowledgements..............................................242
  References....................................................242
  Appendix 1.  Counts of Rows Reviewed by Queries...............244

Chapter 6  An Engineering Database Benchmark....................247
           R.G.G. Cattell
  6.1  Introduction.............................................247
  6.2  Engineering Database Performance.........................249
  6.3  The Benchmark Database...................................250
  6.4  Benchmark Measures.......................................253
  6.5  Benchmark Justification..................................257
       6.5.1  Earlier Benchmark.................................257
       6.5.2  HyperModel Benchmark..............................257
       6.5.3  Summary of Engineering Database Benchmark
              Rationale.........................................259
  6.6  Running OO1 and Reporting Results........................259
  6.7  Porting OO1 to Three DBMSs...............................264
       6.7.1  OODBMS............................................264
       6.7.2  RDBMS.............................................265
       6.7.3  INDEX.............................................266
  6.8  OO1 Measurements on Three DBMSs..........................268
  6.9  Variations...............................................272
  6.10 Summary..................................................275
  References....................................................280
  Appendix:  SQL Definitions....................................281

Chapter 7  The Neal Nelson Database Benchmark (TM)..............283
           Neal Nelson
  7.1  Introduction.............................................283
  7.2  Description of the Benchmark Operation and
       Methodology..............................................286
       7.2.1  Running the Test..................................286
  7.3  Description of the Database Design and Organization......287
  7.4  Sample Queries...........................................288
  7.5  Sample Test Results......................................290
  7.6  Observations and Findings from Customer Uses of
       the Benchmark............................................293
       7.6.1  Army Database.....................................293
       7.6.2  Retesting of an Existing Application..............295
  7.7  Checklist To Help Determine the Validity of Database
       Benchmark Results........................................296
  7.8  Conclusion...............................................299

Chapter 8  Doing Your Own Benchmark.............................301
           Tom Sawyer
  8.1  Why Benchmark?...........................................301
  8.2  Fundamental Questions....................................302
       8.2.1  Programmer Time...................................302
       8.2.2  Machine Resources.................................302
       8.2.3  Management Time...................................303
       8.2.4  Lost Opportunity..................................303
  8.3  What Can Be Learned?.....................................304
       8.3.1  What You Will Not Learn...........................304
  8.4  Benchmark Specification -- An Extended Example...........305
       8.4.1  The Online Transactions...........................306
       8.4.2  Ad-Hoc Queries....................................310
       8.4.3  The Batch Jobs....................................311
       8.4.4  Error Checking....................................312
       8.4.5  Omissions.........................................312
       8.4.6  A Caution.........................................312
  8.5  The Measurement Environment..............................313
       8.5.1  The Batch Environment.............................313
       8.5.2  The Report........................................314
       8.5.3  The Update........................................315
       8.5.4  The Online Environment............................319
  8.6  Beware the War of Wizards................................319
  8.7  Summary..................................................319

Appendix........................................................321
           Michael Franklin
References......................................................325
Index...........................................................331

THE BENCHMARK HANDBOOK FOR DATABASE AND TRANSACTION PROCESSING SYSTEMS,
Edited by Jim Gray (Digital Equipment Corporation)
April 1991; 334 pages; cloth; ISBN 1-55860-159-7; $49.95

Ordering Information:

Shipping is available at cost, plus a nominal handling fee. In the U.S. and
Canada, please add $3.50 for the first book and $2.50 for each additional
book for surface shipping; for surface shipments to all other areas, please
add $6.50 for the first book and $3.50 for each additional book. Air
shipment is available outside North America for $45.00 for the first book
and $25.00 for each additional book. American Express, MasterCard, Visa,
and personal checks drawn on US banks are accepted.

MORGAN KAUFMANN PUBLISHERS, INC.
Department GR
2929 Campus Drive, Suite 260
San Mateo, CA 94403 USA

Phone: (800) 745-7323 (in North America)
       (415) 578-9928
Fax:   (415) 578-0672
email: morgan@unix.sri.com

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
For information send mail to info-sunflash@sunvice.East.Sun.COM.
Subscription requests should be sent to sunflash-request@sunvice.East.Sun.COM.
Archives are on solar.nova.edu and paris.cs.miami.edu.

All prices, availability, and other statements relating to Sun or third
party products are valid in the U.S. only. Please contact your local Sales
Representative for details of pricing and product availability in your
region. Descriptions of, or references to, products or publications within
SunFlash do not imply an endorsement of that product or publication by Sun
Microsystems.

John McLaughlin, SunFlash editor, flash@sunvice.East.Sun.COM. (305) 776-7770.