GIS


DDIG member Ethan Watrall (Asst. Professor of Anthropology @ MSU) sends us the following information about his upcoming Cultural Heritage Informatics (CHI) field school, which is part of the CHI Initiative at Michigan State University.

Excerpts are quoted below. For full details, please see the PDF announcement (link).

Site link: <http://chi.matrix.msu.edu/fieldschool>
Email: watrall@msu.edu

We are extremely happy to officially announce the Cultural Heritage Informatics Fieldschool (ANP491: Methods in Cultural Heritage Informatics). Taking place from May 31st to July 1st (2011) on the campus of Michigan State University, the Cultural Heritage Informatics Fieldschool will introduce students to the tools and techniques required to creatively apply information and computing technologies to cultural heritage materials and questions.

The Cultural Heritage Informatics Fieldschool is a unique experience that uses the model of an archaeological fieldschool (in which students come together for a period of 5 or 6 weeks to work on an archaeological site in order to learn how to do archaeology). Instead of working on an archaeological site, however, students in the Cultural Heritage Informatics Fieldschool will come together to collaboratively work on several cultural heritage informatics projects. In the process they will learn a great deal about what it takes to build applications and digital user experiences that serve the domain of cultural heritage – skills such as programming, user experience design, media design, project management, user centered design, digital storytelling, etc. …

The Cultural Heritage Informatics Fieldschool is open to both graduate students and undergraduates. There are no prerequisites (beyond an interest in the topic). Students from a wide variety of departments, programs, and disciplines are welcome. Students are required to enroll for both sections 301 (3 credits) and 631 (3 credits) of ANP 491 (Methods in Cultural Heritage Informatics).

Admission to the Cultural Heritage Informatics Fieldschool is by application only.

To apply, please fill out the Cultural Heritage Informatics Fieldschool Application Form <http://chi.matrix.msu.edu/fieldschool/chi-fieldschool-application>. Applications are due no later than 5pm on March 14th. Students will be notified as to whether they have been accepted by March 25th.

A New York Times article describes how “the husband-and-wife team of Arlen F. Chase and Diane Z. Chase tried a new approach using airborne laser signals that penetrate the jungle cover and are reflected from the ground below. They yielded 3-D images of the site of ancient Caracol, in Belize, one of the great cities of the Maya lowlands.”

[Image: NYT lidar mapping in Belize]

“In only four days, a twin-engine aircraft equipped with an advanced version of lidar (light detection and ranging) flew back and forth over the jungle and collected data surpassing the results of two and a half decades of on-the-ground mapping, the archaeologists said. After three weeks of laboratory processing, the almost 10 hours of laser measurements showed topographic detail over an area of 80 square miles, notably settlement patterns of grand architecture and modest house mounds, roadways and agricultural terraces.”

“[T]he primary financing of the project [came] from the little-known space archaeology program of the National Aeronautics and Space Administration. The flights were conducted by the National Science Foundation’s National Center for Airborne Laser Mapping, operated by the University of Florida and the University of California, Berkeley.”

“The Airborne Laser Terrain Mapper, as the specific advanced system is named, issued steady light pulses along 62 north-south flight lines and 60 east-west lines. This reached to what appeared to be the fringes of the city’s outer suburbs and most agricultural terraces, showing that the urban expanse encompassed at least 70 square miles.”

I’m happy to announce this year’s Open Archaeology Prize winner. This prize is awarded annually by a jury (on behalf of the Alexandria Archive Institute) to the best open-access, open-licensed, digital contribution to Near Eastern archaeology by an ASOR (American Schools of Oriental Research) member. The winning project, The West Bank and East Jerusalem Searchable Map, “includes lists of archaeological sites that have been surveyed or excavated since Israel occupied the West Bank and East Jerusalem in 1967. Since that time, the oversight of the antiquities of the area has devolved on two government bodies: the military administration’s Staff Officer for Archaeology (SOA) in Judea and Samaria and the Israel Antiquities Authority (IAA). The IAA, which is responsible for East Jerusalem, is a civil branch of government and its records are open for inspection. Some of the records of the Staff Officer for Archaeology in Judea and Samaria are being accessed in full for the first time as a result of the joint Israeli-Palestinian Archaeology Working Group. This involved a team of Israeli and a team of Palestinian archaeologists and cultural heritage professionals working in concert to create new data resources that document the single, unitary archaeological landscape of the southern Levant, which is now bisected by the modern borders.” “The data contained in this database is also available in a visually searchable Google Map interface.” It is an initiative of the University of Southern California, Tel Aviv University, and the University of California, Los Angeles.


(cross-posted from the Heritage Bytes blog)

Reading the recent posts by Fennelle Miller and Kevin Schwarz prompted me to look a bit more closely at how we share spatial data. One issue that seems to crop up again and again is cost and complexity.

GIS data is still difficult to share dynamically over the Web, but things are changing. Google Earth, Google Maps, OpenLayers, and similar tools provide great client-side ways of viewing and interacting with spatial data (not just points, but also vector lines and polygons). Google Earth and Google Maps are proprietary, but they are available as free downloads or free APIs. They also work with an XML format (KML) that is pretty simple, enjoys a wide user community, and works with tools that Google did not develop.

There are some tools for transforming the ubiquitous ESRI shapefiles into KML documents (the XML format used by Google’s applications for spatial data); see this blog post at PerryGeo, and also the post’s comments. Here’s a link to some “how to” discussions on using PHP to read MapInfo (.mif) files for use with Google Maps. Here’s a link to an open source PHP class that reads ESRI shapefiles, the first step in converting them on a server to KML or other formats. The point of all this is that, with some development work, we can transform (to some degree at least) typical GIS data into formats that work better on the Web.
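To give a concrete sense of what that conversion involves, here is a minimal sketch in Python using the open source pyshp library (rather than the PHP approaches linked above). The file name "sites.shp" and the SITE_NAME attribute are hypothetical placeholders, and the script assumes point geometries already in WGS84 longitude/latitude.

```python
# Minimal shapefile-to-KML sketch using the open source pyshp library
# ("pip install pyshp"). "sites.shp" and the SITE_NAME attribute are
# hypothetical placeholders; assumes point geometries in WGS84 lon/lat.
from xml.sax.saxutils import escape
import shapefile  # pyshp

sf = shapefile.Reader("sites.shp")
field_names = [f[0] for f in sf.fields[1:]]  # skip pyshp's DeletionFlag field

placemarks = []
for sr in sf.shapeRecords():
    attrs = dict(zip(field_names, sr.record))
    lon, lat = sr.shape.points[0]  # shapefile x = longitude, y = latitude
    name = escape(str(attrs.get("SITE_NAME", "unnamed")))
    placemarks.append(
        "  <Placemark><name>{0}</name>"
        "<Point><coordinates>{1},{2}</coordinates></Point></Placemark>".format(name, lon, lat)
    )

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
    + "\n".join(placemarks)
    + "\n</Document>\n</kml>\n"
)

with open("sites.kml", "w") as out:
    out.write(kml)
```

Lines and polygons need the corresponding KML LineString and Polygon elements, but the basic pattern is the same.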

Of course, GML (a community-developed open standard) is a better choice than KML for storing GIS data. KML is needed for Google’s great and easy-to-use visualization tools, but GML is a much more robust standard for GIS data. GML also has the advantage of being an open, non-proprietary XML format: you’re not locked into any one software vendor, and you gain important data-longevity advantages. It should be noted that OpenLayers (an open source counterpart to Google Maps) supports GML.
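To make the contrast concrete, here is a rough sketch of the same hypothetical point encoded both ways (the coordinates are made up, and in real use the GML geometry would normally sit inside an application-schema feature element). One caveat worth flagging: axis order conventions differ, and tools do not always agree, so the ordering shown is illustrative rather than authoritative.

```python
# Illustrative only: one hypothetical point encoded as KML and as GML.
# KML fixes the coordinate order as longitude,latitude; GML with an EPSG:4326
# srsName nominally uses latitude longitude, but tools vary, so always check
# axis order when converting between the two.
kml_point = """<Placemark xmlns="http://www.opengis.net/kml/2.2">
  <name>Hypothetical site</name>
  <Point><coordinates>-88.67,16.76</coordinates></Point>
</Placemark>"""

gml_point = """<gml:Point xmlns:gml="http://www.opengis.net/gml"
    srsName="urn:ogc:def:crs:EPSG::4326">
  <gml:pos>16.76 -88.67</gml:pos>
</gml:Point>"""
```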

However, I’m not sure of the immediate need to go through all this effort. Sure, it’s nice to have GIS data easily viewable in a web browser or in a slick visualization tool like Google Earth. But the fundamentals of data access, longevity, and discovery need to be in place before we put lots of effort into online visualization.

Instead, we should look at strategies for making our GIS data easier to find and maintain. And we need to approach the issue pragmatically, since overly complex or elaborate requirements will mean little community uptake. Perhaps we can look at ways of registering GIS datasets (ideally stored in GML) in online directories with some simple metadata (“information about information”). A dataset’s general location (say, lat/lon point coordinates), some information about authorship, keywords, and a stable link for downloading the full GIS dataset would be an excellent and simple start. Simple point data describing the general location of a project dataset would be enough to build an easy map interface that lets users find information by location.
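As a sketch of what such a directory record might hold, something as minimal as the following would already be useful. All of the field names, values, and URLs here are hypothetical; the point is simply that a rough location, authorship, keywords, and a stable download link go a long way.

```python
# A hypothetical, minimal registry record for one GIS dataset.
# Field names and values are illustrative only.
dataset_record = {
    "title": "Example Valley survey, 2006 season",
    "authors": ["J. Doe", "A. Example"],
    "keywords": ["surface survey", "lithics"],
    "lat": 16.76,   # general project location only, not site-precise
    "lon": -88.67,
    "download_url": "http://example.org/data/example-valley.gml",
}
```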

Such directories can be maintained by multiple organizations, and they can share and syndicate their content with tools such as GeoRSS feeds (RSS with geographic point data). It’s easy to develop aggregation services from such feeds. You can also use something like Yahoo Pipes to process these feeds into KML for use in Google Earth (we do that with Open Context, though it still needs some troubleshooting).
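As a rough sketch of the publishing side of that syndication (again with hypothetical names, URLs, and timestamps), here is how a directory could expose records like the one sketched above as a simple GeoRSS-flavored Atom feed, which aggregators, Yahoo Pipes, or a small script could then pick up and turn into KML or anything else.

```python
# Sketch: publish registry records as an Atom feed with simple GeoRSS points.
# Names, URLs, and timestamps are hypothetical; georss:point is "lat lon".
from xml.sax.saxutils import escape

records = [dataset_record]  # the hypothetical record sketched above

entries = []
for rec in records:
    entries.append(
        "  <entry>\n"
        "    <title>{title}</title>\n"
        "    <id>{url}</id>\n"
        "    <updated>2007-06-01T00:00:00Z</updated>\n"
        "    <author><name>{author}</name></author>\n"
        '    <link rel="enclosure" href="{url}"/>\n'
        "    <georss:point>{lat} {lon}</georss:point>\n"
        "  </entry>".format(
            title=escape(rec["title"]),
            author=escape(rec["authors"][0]),
            url=escape(rec["download_url"]),
            lat=rec["lat"],
            lon=rec["lon"],
        )
    )

feed = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<feed xmlns="http://www.w3.org/2005/Atom"\n'
    '      xmlns:georss="http://www.georss.org/georss">\n'
    "  <title>Hypothetical archaeology GIS dataset directory</title>\n"
    "  <id>http://example.org/datasets/feed</id>\n"
    "  <updated>2007-06-01T00:00:00Z</updated>\n"
    + "\n".join(entries)
    + "\n</feed>\n"
)

with open("datasets.xml", "w") as out:
    out.write(feed)
```

An aggregator then only needs a standard feed parser to merge several of these feeds into one map-ready list.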

Also, Sean Gillies (with the Ancient World Mapping Center) is doing some fantastic work on “Mush,” his project for processing GeoRSS feeds. See this post and this post for details and examples. With simple tools like GeoRSS feeds, we can contribute toward a low-cost, distributed system that makes archaeological datasets much easier to find and discover through map-based interfaces and some types of spatial querying (such as buffers). This may be a good way to address some of Fennelle Miller’s concerns about recovering and reusing all that hard-won geospatial data.

Of course, site security is an important issue, and we need to find ways of making our data as accessible as possible without endangering sites or sacred locations. I’m glad Kevin Schwarz raised the issue, and it’ll be very useful to learn more about how he and his colleagues are dealing with it.

Fennelle makes a good point. My impression is that agencies are often protective of their GIS data and may fear that wide disclosure will lead to people with nefarious purposes knowing where sites are located. One of the frustrations (also an opportunity) is that through CRM investigations incredibly detailed GPS and GIS databases are often built up about archaeological sites or regions, but there is no policy or architecture in place for capturing much of that data long-term. For example, my firm often conducts GPS-based archaeological survey such that every artifact collected is associated with a GPS point (for example, in a controlled surface collection). But typically, agencies will only want one or a few GPS points for each site (or a shapefile with site boundaries). A lot of these points also are, or could be, tagged with information on stratigraphy, soils, slopes, groundcover, or prior disturbance. So aside from legacy data storage within your own firm’s archives, there is no long-term organized effort to preserve the painstakingly collected data. I am sure there are people in SHPO offices and elsewhere who would be interested in a broader-based archaeology GIS (currently state CR GISs work well, but data collection/display is somewhat limited).

The possibility is that web-based and accessible formats could be used to store and make available archaeological data without compromising the need to secure certain kinds of data. A collaborator of mine has written an XML data format that could be used to tag archaeological data in ways that could be read by various internet scripts. It is pretty basic right now, but it, or something like it, could make distributed GIS or GPS archaeology on the web more feasible! He and I are also collaborating on a web viewer that allows for analysis of spatial archaeological data within any web browser (he is the programmer, not me!). Both icon- and color-based intuitive analyses (Jacques Bertin’s visual variables) and the results of quantitative analyses are available. I’ll post some more information on these ideas if anyone is interested in seeing it.

 

Kevin Schwarz

     

 

I have noticed that more and more federal agencies are requiring archaeological contractors to use GPS and GIS, but few of those agencies then offer the contractors the GIS shapefiles to use in the field. Why are we documenting sites, features, and artifacts with sub-meter accuracy and then using paper records to relocate those same sites the next time out in the field?

I am hoping to persuade everyone to embrace as many of these digital technologies as possible. I use GPS and GIS, but I do not yet own the hardware and software. However, they are the very next purchases I will make, as I consider them almost as important for doing business in 2007 as a computer and a shovel.

I’m interested in hearing other people’s input on this topic. And I’m really interested in hearing what hardware and software people are using for GPS with real-time GIS data. ArcPad, right? Loaded onto what machine?

Fennelle Miller