Testing
Testing and timing have been an integral part of the LAPACK project. Software testing is required to verify new machine-specific versions. Software timing is needed to measure the efficiency of the LAPACK routines and to compare new algorithms and software. In both of these tasks, many vendors have helped us along the way by implementing basic routines on various machines and providing essential feedback (see Table 7).
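To make the timing task concrete, the following is a minimal sketch of the kind of measurement involved: timing one LAPACK routine (the LU factorization DGETRF) and reporting a megaflop rate. It is not part of the LAPACK timing suite described in this paper; it assumes a linked LAPACK library that exposes the conventional Fortran entry point dgetrf_, and the matrix order n = 1000 is an arbitrary illustrative choice.

    /* Illustrative sketch only: time the LU factorization routine DGETRF
     * and report a Mflop/s rate.  Assumes a LAPACK library linked with
     * the usual Fortran calling sequence (trailing underscore). */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    extern void dgetrf_(const int *m, const int *n, double *a,
                        const int *lda, int *ipiv, int *info);

    int main(void)
    {
        const int n = 1000;                     /* problem size (assumed) */
        double *a = malloc((size_t)n * n * sizeof *a);
        int *ipiv = malloc((size_t)n * sizeof *ipiv);
        int info;

        /* Fill the matrix with pseudo-random entries. */
        srand(1);
        for (long i = 0; i < (long)n * n; i++)
            a[i] = (double)rand() / RAND_MAX;

        clock_t start = clock();
        dgetrf_(&n, &n, a, &n, ipiv, &info);
        double seconds = (double)(clock() - start) / CLOCKS_PER_SEC;

        /* 2/3 n^3 is the standard operation count for LU factorization. */
        double mflops = (2.0 / 3.0) * n * n * n / (seconds * 1.0e6);
        printf("DGETRF  n = %d  info = %d  time = %.3f s  %.1f Mflop/s\n",
               n, info, seconds, mflops);

        free(a);
        free(ipiv);
        return 0;
    }

On most Unix-like systems this would be compiled with something like cc timer.c -llapack -lblas, although library names and Fortran name-mangling conventions vary from vendor to vendor.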
Table 7. Vendor Participation
Alliant Computer Sys.
BBN Advanced Comp.
CONVEX Computer
Cray Computer
Cray Research
Digital Equipment Corp.
Encore Computer Corp.
FPS Computing
Fujitsu
Hitachi
IBM ECSEC Italy
Intel
Kendall Square Res.
MasPar
Myrias Research Corp.
NEC
Sequent Computer Sys.
Silicon Graphics
Stardent Computer
Sun Microsystems, Inc.
Supercomputer Sys., Inc.
Thinking Machines Corp.

The strategy we use may not be optimal for all machines. Our objective is to achieve a "best average" performance on the machines listed in Table 8. We are hoping, of course, that our strategy will also perform well on a wider range of machines, including the Intel iPSC, iWarp, MasPar, nCUBE, Thinking Machines, and Transputer-based computers.

Table 8. Target Machines (1-100 Processors)
Alliant FX/80
BBN TC2000
CONVEX C-2
CRAY-2
CRAY Y-MP
Encore Multimax
Fujitsu VP
Hitachi S-820
IBM 3090/VF
Multiflow
Myrias
NEC SX
RISC machines
Sequent Symmetry
Stardent Computer