A Computerized Data Base for Lithic Use-Wear Analysis.

E.S. Lohse, D. Sammons

Abstract - Recent advances in digital imaging and the construction of computerized data bases have great potential for the development of more sophisticated lithic use-wear analyses. We have applied commercially available image analysis packages to the study of stone tool assemblages. High resolution digital imaging greatly aids in the identification and recording of distinctive attributes and patterns of stone tool manufacture and wear. Automeasurement routines, and the importing of images and selected measurements into standardized data bases and statistical packages, greatly facilitate information transfer and analysis. This paper describes the system we have developed, identifies problems, and suggests future directions. To date, many problems have been practical, solvable technical ones centered on usable memory, transfer rates, and file sizes. More compelling problems are the definition of data base structure and the development of explicit terminology and standardized measurements reflective of current research. We have constructed an analytical framework in Visual dBASE. We are now developing a presentation that will train student analysts in the use of our analytical system through simulations run on a highly interactive CD-ROM. This paper summarizes construction of that CD-ROM and outlines avenues for future work.

1 Introduction

Digital imaging technology is making dramatic inroads in archaeology. Cheaper, easier to use systems make imaging applications accessible to a broad range of researchers, enhancing the rigor and potential of analysis and promising major improvements in data base development and information transfer. Recent papers by Andresen and Madsen (1996), Grace (1997), and Hinge (1996) characterize the state of archaeological data base building. Andresen and Madsen (1996) emphasize construction of relational data bases to facilitate recording, analysis, and presentation, citing use of Microsoft ACCESS as a sophisticated, low cost system that enhances object-tracking (cf. Booch 1991). Grace (1997) has created an on-line hypertext version of his 1989 monograph on the quantification and computerisation of microwear analysis (Grace 1989), and has added topically current discussions of expert systems and a good bibliography detailing work on computer databases and autoclassification systems. Hinge (1996) addresses the overriding problems of serving less expert users and of defining better questions and answers. He notes that most systems develop as prototypes or tentative systems. Factors in implementation include speed of development for these prototypes, lowered costs, and participation of users in development. The goal is to allow greater interrogation of data sets without recourse to expert users or specialized knowledge. Module development is expected to become routine, and researchers will be urged to avoid the "spreadsheet mentality" that assumes all data must be in the same table for adequate comparison. Hinge (1996) notes that a paramount stumbling block will be the increasing permeability of expert boundaries, which will cause considerable reluctance or perhaps animosity toward adoption of new analytical frameworks.

We have been using Visual dBASE for Windows for routine data base construction in our standard archaeological analyses (Lohse 1996). Borland's dBASE has been a consistent choice for archaeological data base applications (cf. Huggett 1992; Lang 1992; Eiteljorg 1995), although similar relational data bases incorporating data, sound, and images have been developed by other software publishers. Common data structures in relational data bases include tables, indexes, queries, forms, reports, and labels. The table is the basic unit of data management. Indexes speed access to records. Queries allow data to be selectively tagged and viewed in the context of specific questions. A primary strength of relational data bases is the ready construction of forms that facilitate entry, editing, and display of data. Menu bars and pull-downs provide excellent user interfaces. Data interchange and manipulation are also facilitated through the dBASE navigator. Customization is easy through selection of different icons and displays. This type of software has become an industry standard, is easy to use, and will probably continue to be easy to update in the future.

Our customized relational data base uses the unique specimen number as the key to all other information. Users are able to reference an individual specimen by any specified attribute. User maneuvering is facilitated by organizing the data base in a series of layers that logically incorporate the three primary data types: alphanumeric, cartographic, and photographic. All compressed image files can be linked to any inquiry and brought to view on request. Reports linking images and selected data fields can be routinely produced by users. Links can logically range from artifact descriptions, to artifact classes, to assemblages, site profiles, level maps, and site maps, to GIS displays of larger regions. The user can assemble edited data into new data bases that summarize or explore patterns from artifact to class to site to region. Data manipulation is quickly implemented at all levels.
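To make the keying concrete, the following is a minimal sketch in Python with SQLite rather than our Visual dBASE implementation; the table and column names are hypothetical illustrations of how a unique specimen number can link alphanumeric records to image files.

```python
import sqlite3

# Hypothetical schema (not our actual dBASE tables): the unique
# specimen number keys every other layer of information.
con = sqlite3.connect("lithics.db")
con.executescript("""
CREATE TABLE IF NOT EXISTS specimen (
    fs_number TEXT PRIMARY KEY,   -- e.g. 'FS 295-115'
    site      TEXT,
    feature   TEXT,
    level     TEXT
);
CREATE TABLE IF NOT EXISTS image (
    image_id      INTEGER PRIMARY KEY,
    fs_number     TEXT REFERENCES specimen(fs_number),
    magnification INTEGER,        -- e.g. 33 for a 33X capture
    path          TEXT            -- compressed image file on disk
);
CREATE INDEX IF NOT EXISTS idx_image_fs ON image(fs_number);
""")

# Any inquiry can then pull attributes plus linked images by specimen number.
rows = con.execute("""
    SELECT s.fs_number, s.feature, i.magnification, i.path
    FROM specimen s LEFT JOIN image i ON i.fs_number = s.fs_number
    WHERE s.fs_number = ?
""", ("FS 295-115",)).fetchall()
```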

We are committed to using Visual dBASE in development of the CD-ROM described in this paper. We are constantly upgrading our systems, however, and as this paper goes into publication we are switching data entry to Microsoft ACCESS and Visual FoxPro. This will not affect development of our teaching CD, since the basic entry form, attribute names, and categories will remain the same irrespective of the particular data base management system, but our move to FoxPro should facilitate creation of databases that can be used on desktop systems, on mainframes, or on-line.

Practical problems in database design are myriad, but there are basic standards for archaeological fieldwork, analysis, reporting, and collections management and care, and use of these "building blocks" in database construction will allow creation of usable, updatable information systems (cf. Arroyo-Bishop and Zarzosa 1992; Hansen 1993). The practical concern is that data layers and different data bases have "hooks" that allow integration (e.g., specimen numbers). The choice of data base programs is varied, but most researchers have come to emphasize off-the-shelf applications like dBASE, ACCESS, and FoxPro, lured by user-friendly programming, easy upgradability, and the likelihood of continued upgrades in the future. Systematic construction of the data base in a generic, powerful relational structure seems to be the key to future usability and modifiability. We continue to invest considerable effort and student training in developing computer databases, acknowledging that change will in all likelihood be constant: we promote one application today knowing that we will eventually shift to a more powerful, upgraded, or convenient data base package.

2 Why a CD-ROM?

We have concluded that we must develop a training exercise for students that accurately portrays stone tool research and reflects the latest computer applications. Our research environment, like most, is strapped for funds, and we cannot assume continual upgrades in equipment and software. Inevitably, we have mismatched pieces of equipment, marginal to acceptable computer hardware, and only selective software upgrades. What we do have are collections to analyze and students to train. Lacking redundant sets of equipment, hardware, and software, we have chosen to construct a highly interactive CD-ROM that effectively mirrors our restricted laboratory environment. Students will be trained in standard lecture and reading environments and then sent out with a CD that can be run on any contemporary PC, at home or on campus. By reading text overviews and pursuing exercises using our forms, thorough glossaries, and high quality images of variable microscopic views, students will be able to experience our laboratory environment before being seated at our microscopes with their attached hardware and software. The CD-ROM thus provides powerful experiential training that allows students to develop confidence and competence before tackling the actual task at hand.

The data base for lithic analysis described in this paper is being developed in CD-ROM format using Macromedia's DIRECTOR and the Visual dBASE forms and tables developed for our standard lithic analyses. Pull-downs have been designed in DIRECTOR to mimic our dBASE forms. Five separate forms are included on the CD-ROM. One window contains a lexicon of terms designed to facilitate user learning. Another supplies JPEG images at variable magnification designed to illustrate terms in the pull-downs and in the lexicon. Users are directed to available images in the lexicon window. The screen, data structure, tables, lexicon, and images can all be printed to hard copy by the user.

3 The Database: from theory and method to design

Our data base can be no better than its theoretical and methodological foundation. Research into stone tool manufacture and use has a long history, but transfer of research models to a computer format is still an iffy proposition, with more prototypes and experiments than finished, tried-and-true products. Most data bases incorporating lithics treat tools and by-products of manufacture at a cursory, macroscopic level, commensurate with pragmatic work in cultural resource management projects. The system described here has a narrowly circumscribed application, focused on development of discrete behavioral interpretations for well defined archaeological contexts. It is not designed to be applied to all assemblages regardless of archaeological context. It is tedious and time-consuming to apply and should be used only in circumstances that promise value for the effort. Lohse (1996) lays out the relative costs and benefits of employing an analytical system like the one described here, and admonishes that critical prerequisites must be met. These include absolute provenience information with fine stratigraphic control, recognition of tight activity contexts within bounded cultural features, direct bagging of artifactual materials, and staged cleaning of artifacts through carefully controlled macroscopic and microscopic examination (Lohse 1994a; Grace 1997: Section 2).

The protocol employed from collection through analysis to publication is presented in Lohse (1994b, 1996). Once stone artifacts arrive in the laboratory, they are transferred to separate preparation and analysis trajectories depending on the nature of the collection (sorted by recovery context: dry-screening, wet-screening, direct bagging, or general matrix sample). All bagged matrix sample fractions are passed through flotation or dry screening in graduated geologic sieves. All artifacts above 2 cm, and any considered to have high diagnostic value, are bagged separately and entered into our database with a unique specimen number (e.g., FS 295-115) that keys the specimen to a bag lot from a cultural or natural feature or level. The specimen number remains the constant reference for tracking the specimen through analysis and reporting.
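As a small illustration of how such a number can be handled programmatically, the sketch below parses a field specimen number into its components; it is Python of our own, and the split into a bag-lot and an item component is our assumed reading of the "FS 295-115" format.

```python
import re

def parse_fs(fs: str):
    """Split a field specimen number like 'FS 295-115' into its
    bag-lot and item components (assumed interpretation)."""
    m = re.fullmatch(r"FS (\d+)-(\d+)", fs)
    if not m:
        raise ValueError(f"unrecognized specimen number: {fs}")
    bag_lot, item = map(int, m.groups())
    return bag_lot, item

assert parse_fs("FS 295-115") == (295, 115)
```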

Coarse excavation and recording procedures will simply nullify the basis for detailed microscopic analysis. Screen wear can probably be separated from use wear, but the removal of significant site matrix and residues before careful microscopic examination remains a central concern. Artifacts under magnification present a stratified activity context, with discernible areas of manufacture and wear and associated layers of residue indicative of discrete uses. The ideal is examination of selected, relatively dirty artifacts, carefully cleaned only as the analyst moves through an established protocol, characterizing diagnostic zones and layers.

The analytical framework perforce emphasizes drawing samples from discrete prehistoric site activity contexts. Initially, all flaked stone tools are sorted and classified at the macroscopic level in a simple paradigmatic classification (Campbell 1984; Lohse 1994a). Selected specimens are passed on to detailed microscopic examination for diagnostic attributes of manufacture and use (Lohse 1996). As diagnostic elements are defined, the specimen receives increasingly detailed examination. Throughout this process, analysts maintain databases, create notes in memo fields, and tag high resolution images to specimen descriptions as appropriate for basic recording prior to cleaning, for future identification, or for publication.

4 Hardware and software

Visual examination is done with a high resolution Quasar 8X video camera and an S-video CCTV camera mounted on a Nikon stereoscopic zoom microscope with fiber optic illumination (see Lohse 1996 for a detailed description of the equipment and procedures employed). The analyst works in a PC Windows 95 environment, examining specimens on the monitor, entering measurements in the open databases, and capturing live digital images. Software utilized at this stage includes Visual dBASE and Image-Pro Plus. Images are compressed for storage without filtering. Enhancement or filtering routines are used only to enable measurement or to bring out highlighted features. All manipulated images are stored as image files separate from untreated record shots of the specimens taken at variable magnifications.

The move from macroscopic to microscopic examination at up to 180X entails a considerable shift in perspective for the analyst. The morphology of the flaked stone artifact becomes equivalent to an archaeological landscape, and the analyst strives to recognize landmarks and pattern boundaries on surfaces that reference measurements from one magnification layer to the next. Variable power magnification zooms perspectives in and out as the analyst first flies over the terrain scanning for potential patterns such as attrition, residues, or polishes. The analyst notes probable areas of interest, and then returns to examine and document separate "tools" on the surface of the artifact.

5 Database design: measurements and autoclassification

Directionality enters the analysis as the analyst makes decisions concerning image capture, resolution, magnification, and enhancement. The analyst chooses variables based on interpretation of the pattern observed. Variable magnifications will be used. Filters may be applied that define patterns. Edge and surface landmark definition will be critical, as is the setting of intensity levels for pixel values. The result will be an information-rich image or record, if not a particularly realistic or attractive one.

A digital image is discretized in spatial coordinates and brightness values, with a matrix of row and column indexes defining points in the image (Gonzalez and Woods 1993 present a comprehensive overview of digital imaging). Corresponding matrix element values identify gray levels at specified points in the image. Elements of this digital array are called pixels or pels. Square arrays are typically constructed with sizes and numbers of gray levels that are integer powers of two. A monochrome image is a two-dimensional light intensity function f(x,y), where x and y represent spatial coordinates and f is a value proportional to the brightness or gray level of the image at that point.
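The same definition can be stated in a few lines of code. The following sketch (Python with NumPy; the 512 x 512 size and 8-bit depth are illustrative values, not our capture settings) represents a monochrome image as a two-dimensional array of gray levels.

```python
import numpy as np

# A monochrome digital image as the intensity function f(x, y):
# a 512 x 512 array (an integer power of two) of 8-bit gray levels
# (2**8 = 256 possible values).
f = np.zeros((512, 512), dtype=np.uint8)

# f[x, y] is the gray level at spatial coordinates (x, y).
f[100, 200] = 178
levels = 2 ** 8                      # number of gray levels
print(f.shape, levels, f[100, 200])
```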

Our use wear analysis entails characterization of two major dimensions: edges and surfaces. The physical properties of different stones range from opaque to translucent and from nonreflective to reflective. This natural variation makes measurements tedious and forces adherence to a tightly structured protocol. Five separable steps are observed, sketched in skeleton form below: image acquisition, using an imaging sensor and digitization of the signal produced; processing, which includes techniques for enhancing contrast, removing noise, and isolating regions of diagnostic texture; segmentation, which partitions the image into its constituent parts or attributes; representation and description, in which the raw pixel data constituting boundaries or points is processed to highlight diagnostic features; and recognition and interpretation, which assigns a label to an object based on the information used to describe the image. Interpretation assigns meaning to the set of recognized objects through direct reference to the explicit analytical framework.
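The skeleton below shows only how the five steps chain together; it is Python of our own, and every function body is a deliberately trivial stand-in for the far richer routines that packages such as Image-Pro Plus implement.

```python
import numpy as np

def acquire():
    """Step 1: digitize the sensor signal into a gray-level array."""
    return np.random.default_rng(0).integers(0, 256, (256, 256), dtype=np.uint8)

def preprocess(img):
    """Step 2: enhance contrast and suppress noise (here, a linear rescale)."""
    lo, hi = int(img.min()), int(img.max())
    return ((img - lo) * (255.0 / max(hi - lo, 1))).astype(np.uint8)

def segment(img):
    """Step 3: partition the image into parts (here, a global threshold)."""
    return img > img.mean()

def describe(mask):
    """Step 4: reduce raw pixels to diagnostic features (here, area fraction)."""
    return {"area_fraction": float(mask.mean())}

def recognize(features):
    """Step 5: assign a label from the analytical framework (toy rule)."""
    return "candidate wear zone" if features["area_fraction"] > 0.5 else "background"

print(recognize(describe(segment(preprocess(acquire())))))
```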

Once edge boundaries have been established, various line and area measurements can be made. Measurements may be taken automatically where resolution permits, and manually on a fairly consistent basis. All measurements are taken relative to screen pixel position (e.g., the number of pixels within an outline). The number of pixels included in the line or area measurement is then scaled and calibrated to any specified coordinate system (e.g., 1 pixel = 1 centimeter). Accurate calibration also forces adjustment of the aspect ratio that defines the relationship between the vertical and horizontal axes of the image. Accuracy further entails rigid control over effective contrast and background flattening or subtraction. Intensity measurements are commonly used to define features and characterize surfaces or edges. For instance, the line profile shown in Figure 1 is scaled to 1 pixel = 0.001 cm and is recorded as a gray scale intensity calibration based on a standard optical density curve.
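The arithmetic of calibration is simple, as the sketch below shows; it is Python of our own, and the scale and aspect-ratio values are illustrative, matching the 1 pixel = 0.001 cm example above.

```python
# Scaling pixel measurements to physical units, with an aspect-ratio
# correction for non-square pixels (values are illustrative).
CM_PER_PIXEL_X = 0.001
ASPECT_RATIO = 1.0          # vertical/horizontal pixel size; 1.0 = square
CM_PER_PIXEL_Y = CM_PER_PIXEL_X * ASPECT_RATIO

def line_length_cm(dx_pixels: float, dy_pixels: float) -> float:
    """Length of a measured line in centimeters after calibration."""
    dx_cm = dx_pixels * CM_PER_PIXEL_X
    dy_cm = dy_pixels * CM_PER_PIXEL_Y
    return (dx_cm ** 2 + dy_cm ** 2) ** 0.5

print(line_length_cm(300, 400))   # 0.5 cm for a 300 x 400 pixel run
```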


Figure 1. Image-Pro Plus screen capture showing pixelated image (800% screen) of 33X magnification of biface (FS 233-1). Upper right, line profile A’.

Use of automatic measurements is strongly conditioned by the character of the digitized image. Basic brightness, contrast, and gamma values are critical to recording a good, information-rich image. Image-Pro Plus measurement routines are best applied in back-lit rather than reflected incident light environments (slide mounts rather than stage mounts), but many operations can be adapted for analysis of three-dimensional, reflectively lit surfaces. Pragmatic problems of effective source lighting and variable reflected light from shiny, asymmetrical surfaces plague microscopic study of stone tools. Digitization of the images and subsequent manipulation of the data is remarkably easy compared to the practical difficulties of manipulating lithic objects on stages and controlling the high reflectivity of facets set at myriad angles to the light source.

The line profile in Figure 1 accurately measures the undulating surface of a flake scar bounded by two arrises. This capability becomes useful when the analyst wants to apply mapping routines to the variable intensity values of the digitized tool surface, as shown in Figure 2. In this example, Field Specimen No. 233-1 has been outlined with an autotrace tool. The count option in Image-Pro Plus was then used to outline and count the designated objects: basal flake scars. The automatic intensity option was selected, and a line was drawn automatically along the intensity gradient separating the objects from their backgrounds. The watershed-split feature then evaluates the two flake scars for potential splitting into separate objects. As shown, measurement of the intensity values of the pixels recording the flake's variably lit surface defined six separable zones that correspond to the surface undulations of the flake scars.
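Readers without access to Image-Pro Plus can reproduce the watershed-split idea with open-source tools. The sketch below, in Python with scikit-image, is our analogue and not the proprietary routine used here: it separates two touching objects by flooding a distance transform.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed

# Toy binary mask with two overlapping circular "flake scars".
mask = np.zeros((80, 80), dtype=bool)
yy, xx = np.ogrid[:80, :80]
mask |= (yy - 40) ** 2 + (xx - 28) ** 2 < 15 ** 2
mask |= (yy - 40) ** 2 + (xx - 52) ** 2 < 15 ** 2

# Flood the negated distance transform from one seed per object.
distance = ndi.distance_transform_edt(mask)
markers = np.zeros_like(mask, dtype=int)
markers[40, 28], markers[40, 52] = 1, 2   # toy seeds at known centers

labels = watershed(-distance, markers, mask=mask)
print(np.unique(labels))   # 0 (background) plus the two split objects
```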


Figure 2. Image-Pro Plus screen capture showing autotrace of the object outline, manual trace of two selected basal flake scars, and autotrace and split of the interior surface of the flake scars to derive six areas defining significant surface undulations (biface FS 233-1).

Other measurement techniques could be presented, but the important point to emphasize here is the almost unlimited capability for measuring heretofore difficult-to-define, if not impossible-to-measure, relations on the surfaces of stone tools.

6 Analyst training

Analysts are trained in traditional macroscopic and microscopic examination frameworks, receive practicums in the rudiments of knapping, and go through limited workshops in basic software applications (Lohse 1996). The CD-ROM described here is designed to facilitate analysts' training in difficult protocols entailing microscopic examination utilizing digital imaging. We intend to produce a training exercise that effectively imitates our microscopic system and allows student analysts to develop a knowledge base sufficient to utilize the system effectively. The CD-ROM substitutes for redundant sets of expensive equipment and software, giving students a place to begin to operationalize tenets of theory and method. After working through the exercise outlined here, the student analyst can be seated at the microscope and keyboard with the prerequisite grounding for quickly and efficiently employing digital imaging routines.

7 Design for augmented learning

The instructional objectives of the lithic analysis CD-ROM are (1) for students to identify the various attributes relevant to functional or technological analysis and (2) for students to classify the characteristics of an attribute appropriately. For example, students will be able to identify eraillures and note their presence or absence; or, students will be able to locate the bulb of percussion and describe it as pronounced, moderate or weak.

The student analyst, before entering data into the Visual dBASE form, must first demonstrate the ability to classify different attributes correctly. The CD-ROM will assist students in defining and visualizing those attributes by providing a glossary of all terms used and digital images of appropriate examples that have been created especially for this exercise.

The CD-ROM will not literally be a Visual dBASE form but will mimic the actions of our data base forms. Designed with Macromedia DIRECTOR, the CD-ROM allows us to create an interactive, instructional form utilizing the same backgrounds and pull-downs that the student analyst will find in Visual dBASE itself. In addition, several explanatory pop-ups (the glossary, images, and the chalkboard) are available on the instructional CD-ROM to assist students in recognizing the attributes they will classify.

A completed DIRECTOR project is termed a "movie." The screen upon which all action appears is the "stage" and the various buttons, pull-downs, and images form the "cast" of the movie; each individual cast member is termed a "sprite." Sprites are placed and given direction upon the stage by use of the score, in which each sprite is given a specific location and animation (if appropriate) relative to each individual frame of the movie.
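This vocabulary maps onto a simple data model. The following rough analogue, written in Python rather than Lingo, may help readers unfamiliar with DIRECTOR; the class and field names are ours, not Macromedia's.

```python
from dataclasses import dataclass, field

@dataclass
class Sprite:
    """A cast member placed on the stage, e.g. a button or pull-down."""
    name: str
    position: tuple = (0, 0)     # location on the stage

@dataclass
class Movie:
    """A movie holds the cast; the score places sprites frame by frame."""
    cast: list = field(default_factory=list)
    score: dict = field(default_factory=dict)   # frame -> [(sprite, pos)]

    def place(self, frame: int, sprite: Sprite, pos: tuple):
        self.score.setdefault(frame, []).append((sprite, pos))

movie = Movie()
form = Sprite("TECHAN p.2 background")
movie.cast.append(form)
movie.place(frame=1, sprite=form, pos=(0, 0))   # occupies the whole stage
```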

Animation and interactivity are easily established by the relationship of sprites to each other within the score and by placing markers within the score to move to other screens or even other DIRECTOR movies. Markers facilitate the multibranching aspect of DIRECTOR movies, which was more difficult in earlier versions; the multibranching ability of the current DIRECTOR 6.0 is ideal for the lithic analysis CD-ROM, in which students need to be able to branch off to examine the glossary or images as they need clarification of a term. Multibranching or specialized paths can also be created by the author using Lingo.

In Figures 3-8, the DIRECTOR screen is replicated to indicate how sprites will appear and move upon the stage.


Figure 3. Analyst’s screen of TECHAN p.2, showing basic entry form.

In Figure 3, the Visual dBASE form for Technology Analysis, page 2, is the background upon which all other action takes place. Although this form appears to be a background, it is technically Cast Member No. 1 and occupies the entire stage. Invisible buttons are placed over each term on the form so that if the mouse is clicked on a certain term, the pull-down of choices appears. In data entry, the analyst is able to determine which of the terms in the pull-down correctly characterizes that attribute. For example, Platform Preparation is characterized as "Faceted, Dihedral, or Ground" (Figure 4).


Figure 4. Analyst’s screen of TECHAN p.2, showing Platform Preparation pull-down.

During data entry, the analyst clicks on the correct choice and it is automatically entered into the data base table. In the instructional CD-ROM, however, this pull-down is a second cast member or sprite, and it also contains invisible buttons which link the terms in the pull-down to the glossary.
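In outline, an invisible button is simply a hit region mapped over a term that dispatches to the linked content. A minimal sketch of the idea, in Python rather than Lingo, with coordinates and dispatch logic that are entirely hypothetical:

```python
# Hypothetical hit regions: term -> (x, y, width, height) on the stage.
buttons = {
    "Platform Preparation": (40, 120, 180, 18),
}

def hit(term: str, click: tuple) -> bool:
    """True if a mouse click lands inside the term's invisible button."""
    x, y, w, h = buttons[term]
    cx, cy = click
    return x <= cx <= x + w and y <= cy <= y + h

if hit("Platform Preparation", (100, 130)):
    print("show pull-down: Faceted / Dihedral / Ground")
```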


Figure 5. Analyst's screen of TECHAN p.2, showing Glossary pull-down for faceted.

In Figure 5, the student has clicked on "Faceted," causing the Glossary to appear. The Glossary is a separate file hyperlinked to the terms on the form. Once in the glossary, the student may use the scroll bar to view other words: the student is not confined to the specific term that linked him to the glossary. Certain terms within the glossary carry one or two icons. The small microscope icon indicates that a digital image illustrating the term is available; the second icon, a small drawing on a board, indicates that a drawing is available as well.
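One plausible way to picture the glossary's structure is a term keyed to optional image and drawing links, with the two icons signalling which links exist. The Python below is ours, and the definition text and file names are invented for illustration.

```python
# Hypothetical glossary structure: each term may carry links to
# microscope images and/or chalkboard drawings.
glossary = {
    "faceted": {
        "definition": "platform bearing multiple small preparation "
                      "flake scars (working definition, illustrative)",
        "images": ["faceted_10x.jpg", "faceted_33x.jpg", "faceted_80x.jpg"],
        "drawings": ["faceted_board.png"],
    },
}

def icons_for(term: str):
    """Which icons appear beside a glossary entry."""
    entry = glossary[term]
    return {"microscope": bool(entry["images"]),
            "chalkboard": bool(entry["drawings"])}

print(icons_for("faceted"))   # {'microscope': True, 'chalkboard': True}
```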


Figure 6. Analyst’s screen of TECHAN p.2, showing image pop-ups at variable magnification.

In Figure 6, the microscope icon has been clicked, bringing up three images at various magnifications which illustrate a faceted, prepared platform.


Figure 7. Analyst’s screen of TECHAN p.2, showing chalkboard pop-up.

The chalkboard is brought up (Figure 7) after the drawing icon in the glossary is activated. In each case, the glossary, chalkboard, or images can be enlarged to fill the screen should the student wish a magnified view (Figure 8).


Figure 8. Analyst's screen of TECHAN p.2, showing a selected image, chalkboard drawing, and glossary entry.

Once open, the glossary, chalkboard, or images need to be closed before the student moves on to the next choice within the pull-down (in this case, "Dihedral") or to another term on the form.


Figure 9. Main menu for CD-ROM.

Figure 9 shows the opening screen for the CD-ROM, with the various screen choices indicated by buttons. Our examples here (Figures 3-8) are from the Technological Analysis, page 2, form, but there are other sequences for TECHAN p.1 and Functional Analysis pages 1 and 2. Each of the different Visual dBASE forms has been scripted within a separate DIRECTOR movie. The opening screen can therefore be thought of as the main trunk, with each Visual dBASE form occupying a different branch. In addition, there is separate access to the bibliography, the glossary, and the file of images.

Macromedia DIRECTOR is a cross-platform authoring program used to create stand-alone applications. That is, once the DIRECTOR movie is converted to a "projector," it can be played on any computer, even one that does not have DIRECTOR loaded. This makes it an ideal authoring package for us, since we would like students to be able to take the CD-ROM to any computer in the archaeology lab, the campus lab, or their homes.

8 Prototype to implementation and improvement

We are currently developing a digital movie on the reduction of obsidian cores and the manufacture and use of utilitarian and carefully designed obsidian tools. We have received modest funding to work with knappers to record well defined sequences of reduction. A high-speed video camera will be employed to slow down motion and obtain views of the detachment of flakes from the core and the various applications of force. Another high resolution video camera will record the scenes generally, offering better perspective on logistical layout. Using the editing capabilities of Adobe Premiere, we can then convert the footage to digital format and create both a long-running movie and smaller clips to be incorporated into our current CD-ROM presentation or other on-line resources.

The insertion of digital movies into a DIRECTOR file is straightforward. Digital movies are imported as cast members and placed upon the stage, just like any other unanimated sprite. Special effect transitions create a seamless appearance for the digital movie. While the movie plays, the overriding DIRECTOR production is programmed to pause until the clip is completed. Although the inserted digital movie occupies only a single frame of the DIRECTOR production, the DIRECTOR score is directed to wait at that frame while the inserted movie plays. Without the instruction to pause, the DIRECTOR production would move immediately to the next frame, essentially eliminating the movie from view.
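The pause logic can be pictured as a playback head that refuses to advance past the movie's frame until the clip signals completion. A toy sketch of that control flow, in Python rather than Lingo, with frame actions and polling entirely of our own invention:

```python
def run_score(frames, clip_finished):
    """Advance through the score, holding at a clip's frame until done."""
    frame = 0
    while frame < len(frames):
        action = frames[frame]
        if action == "play_clip":
            if not clip_finished():      # hold the playback head here
                continue                 # ...until the clip completes
        frame += 1                       # otherwise advance to next frame

# Toy driver: the clip "finishes" on the third poll.
polls = iter([False, False, True])
run_score(["intro", "play_clip", "next_screen"], lambda: next(polls))
```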

Students will access movie clips through the glossary, in the same way that they have accessed the digital images or the chalkboard. The movie camera icon will appear with relevant terms and will link those terms to the inserted movie clips. The insertion of the digital movies to the DIRECTOR production adds a further multimedia element to the instructional CD-ROM.

The technological application described here is a fun, creative exercise, and we think it can greatly augment student learning. It is also a readily extended and modifiable platform for future developments, whether these include the addition of movies or changes and additions to the analytical framework for stone tool analysis. The thorny part of our exercise has not been development of the mechanical prototype but the difficulty of extracting agreed upon concepts and terms for stone tool analysis from the archaeological literature. Study of stone tool manufacture and use is, of course, commonplace in prehistoric archaeology, but the literature shows very little agreement among analysts, knappers, and replicators on standardization of concepts and terminology (cf. Grace 1989, 1996, 1997; Hayden 1979; Hurcombe 1992; Inizan et al 1992; Tixier et al 1980). Surveys of the extensive literature available, and conversations and correspondence with researchers and knappers, have yielded little agreement. Knappers, in particular, will use sets of terms all but interchangeably, and often show little interest in development of a glossary. This is a paramount first step for us, however, since we intend to set up a workable standardized system of analysis in a computer environment. Moreover, we need to teach students within that system. For consistency of identifications, data entry, analysis, and information transmission, we have built a glossary of terms for stone tool analysis. This list has over four hundred terms defined, most with multiple definitions appropriate to specific contexts of recovery, analysis, and transmission. Terms are specific to the literature on stone tool analysis and encompass basic concepts integral to the software applications being used.

A related struggle has been the need to carefully select terms and develop applications for concepts as we try to apply theoretical and methodological perspectives developed for macroscopic analyses (less than 40X) to a microscopic view in which the surface of stone tools and patterns of attrition, residue, and polish become information layers at variable magnifications. This is a study in pattern recognition: accurately mapping overlapping distributions and carefully recording register marks to place the overlays precisely. Stone tool analysis in this arena is not confined to the older strictures of recording attrition as flaking or polishing and looking to isolate significant attributes drawn from replicative or experimental studies. Careful microscopic inspection offers the promise of accurately mapping and measuring stratigraphic sequences of attrition, polishing, and residue on stone tools. Knowledge of stone fracture under controlled applications of force is essential for establishing the basic artifact landscape before patterned use and attrition occurred. Directionality is created by accurately reconstructing the pristine tool landscape and following the erosion of high to low features and the deposition of residue. Arrises become hills, and flaking creates massive alteration of original surfaces. Facets of abrasion create linked planes across the landscape, and butt ends of flake scars and shallow basins fill with organic residues overlaying zones of manufacture and wear.

A principal impediment in the very recent past has been the inability to create, store, and transmit high resolution images. Today's digital imaging applications allow this on affordable table-top systems. Another traditional problem has been accurate measurement and mapping of areas of wear and residue. Automeasurement techniques in the system used here, and available in many applications, greatly simplify accurate resolution, definition, and measurement of patterns on the surface of stone tools (cf. Grace 1997; Grace et al 1985; Lohse 1996). We are no longer limited to classification of attributes, variously defined, and may use distinctive attributes simply to isolate spots on the artifact landscape that merit more detailed inspection, or as register marks to keep track of myriad overlapping data layers.

Our immediate hurdle is no less daunting than past limitations, but the scenario has changed somewhat. Fingers might have pointed at technological limitations in the past. Now the fingers point back at archaeologists: to resolve operational difficulties embedded in the lack of standardized terms for stone tool analysis and the lack of agreement on significant attributes, whether at the level of identification or measurement, and to adopt theoretical viewpoints with the potential to better handle new technological capabilities (cf. Grace 1993, 1996; Rees et al 1991). Simply put, digital imaging systems have opened a new, information-rich world for lithic analysis, and the old analytical frameworks must be redrawn. Classification based upon old standards is no longer a simple exercise. Debates over "blind tests" and replication results are perhaps not as central as once held. Application of enhanced measurement routines in digital imaging applications allows analysts to accurately map stratigraphic distributions on the surfaces of stone tools. A certain flake scar pattern indicative of a limited range of uses in particular media can now be located beneath organic residue, in stratigraphic context commensurate with good site excavation. Interpretation is just as thorny as identifying activity surfaces on full-size sites, but lithic analysts can certainly postulate activity contexts on stone tools.

We need to retool vocabularies, attempt to apply new concepts, impose greater theoretical rigor, and begin to examine which parts of our established methodological and theoretical tool kit are appropriate in a computer environment. The potential is certainly high, but costs will be incurred in time spent developing workable prototypes and making the myriad operationalizing decisions needed to work efficiently within the computer environment. Our view switches from gross morphology on clean, washed stone tools to microscopic examination of overlapping layers of wear and residue, incorporating singular landmarks on dirty tools that can be cleaned only in carefully staged protocols designed to maximize information return. This cannot be an immediate development, and we doubt that anyone can be certain of the potential we assert here; nor can anyone blithely dismiss the possibility. When artifacts become sites, and arrises and shallow flake scars resemble ridges and valleys, we are nearing a significant jump in attainable analytical rigor. Is it worthwhile? Probably not in traditional narratives involving established plots of cultures and artifact types. The answer might be an emphatic yes, however, if our goal is increasingly finer definition of human behavior in the past. In particular, the rigor may seem worthwhile if what we seek is accurate association in site activity contexts, and if we want to attempt to recognize individual activities in the prehistoric past.

Our fun but very mundane first step has been to develop the analytical system briefly defined here. To teach students in a typical archaeological laboratory lacking good microscopes and sufficient terminals and programs, we have resorted to CD-ROM exercises for the classroom and home. When our prototype is completed, we will send out copies to individuals and institutions, asking for critique and advice, whether at the applications and usability end of the scale or at the analytical end. Our sincere hope is that this exercise is attractive enough, and well enough thought out, to rivet some attention on its shortcomings. We are not solving a problem, only highlighting it, and we hope the result is a contribution prompting more attention from more researchers. The CD-ROM will have "bells and whistles" and the images will be great, but the glossary needs work, and the attributes selected for the pull-down menus can certainly be modified. In summary, we think it is fun, creative, and productive, and we need help in bringing a positive exercise forward.


Bibliography

Andresen, J and Madsen, T, 1996

Arroyo-Bishop, D and Lantada Zarzosa, M T, 1992

Booch, G, 1991

Campbell, S A (Ed), 1984

Eiteljorg, H, 1995

Gonzalez, R C and Woods, R E, 1993

Grace, R, 1989

Grace, R, 1993

Grace, R, 1996

Grace, R, 1997

Grace, R, Graham, I D G and Newcomer, M H, 1985

Hansen, H J, 1993

Hayden, B, 1979

Hinge, P, 1996

Huggett, J, 1992

Huggett, J,

Hurcombe, L M, 1992

Inizan, M L, Roche, H and Tixier, J, 1992

Lang, N A R, 1992

Lohse, E S, 1994a

Lohse, E S, 1994b

Lohse, E S, 1996

Rees, D, Wilkinson, G G, Grace, R and Orton, C R, 1991

Tixier, J, Inizan, M L and Roche, H, 1980