December 2008

I got this in my inbox and thought DDIG readers might be interested (in case you didn’t know about it already):

2PM-4:30PM Pacific Standard Time (10PM-12:30AM GMT or Universal Time)
December 10, 2008
Location: Okapi Island
(You must have the free Second Life browser)

Join us for Burning Çatalhöyük, a project developed by OKAPI, the Berkeley Archaeologists at Çatalhöyük, and the UC Berkeley DeCal program. Çatalhöyük on OKAPI Island, in development since 2006, is an exploration of the past and present of a 9,000-year-old site located in present-day Turkey. In this demonstration we intend to burn the existing models down in order to better understand the use of fire in Neolithic settlements. In consultation with fire experts Karl Harrison and Ruth Tringham, and architecture expert Burcu Tung, a team of undergraduate apprentices has replicated the burning sequence of Building 77, a structure excavated in the summer of 2008. OKAPI Island also hosts reproductions of modern developments present at the site, including a water tower, Sadrettin’s café, the Chicken Shed and the nightly bonfire.

Remixing Activities:

Guided Tour of OKAPI Island by Ruth Tringham (Professor of Anthropology, UC Berkeley, and Principal Investigator of the Berkeley Archaeologists at Çatalhöyük) and the Remixing Çatalhöyük team.
Niema Razavian will introduce the work that the Fall 2008 DeCal class has done on the island, and how this fits into a broader UC Berkeley education.
Roland Saekow will demonstrate his teleportation system, to guide new visitors around the island.
Kira O’Connor will show the site datum she has constructed, and talk about how datums are used at archaeological sites in general.
Clark-Rossi Flores-Beyer will demonstrate the skeleton model he has managed to manipulate into a crouch position, in accordance with how people were buried at Çatalhöyük. He will briefly discuss burial practices in the settlement.
Garrett Wagner and Raechal Perez will discuss their own reproductions of the interiors at Çatalhöyük, and how they decided to configure the space on their own.
Colleen Morgan (UC Berkeley PhD Candidate, excavator at Çatalhöyük) will wrap up the program with a discussion of why virtual reconstructions of archaeological sites are important, and what Second Life can do to increase our understanding of the past.

Second-Life recreation of Çatalhöyük


What is Second Life?

Second Life is a 3-D virtual world created entirely by its residents. Okapi Island is owned and built by the OKAPI team (that’s us below!) and the Berkeley Archaeologists at Çatalhöyük.

Getting Started
To visit Okapi Island, you will need to create a user account and download the client software–both free.

To create an account, visit, click on Join (in the upper right corner) and follow the instructions. Note: You do not need a premium account to use Second Life or visit Okapi Island.

Next, download and install the Second Life client for your computer:

Launch the Second Life client and enter your password. You will likely begin on Orientation Island. To visit Okapi Island, click Map, enter “Okapi” in the search field and click Search. Alternatively, you can click on the following SLurl (Second Life URL) in your browser, and you will be transported there:


Having built out mostly idiosyncratic, departmental-level IT solutions for specific, outside-funded research projects, universities and other institutions of higher learning are now grappling with the expanding and changing demands of their constituents: the academic research community.

The November–December issue of EDUCAUSE Review, a bimonthly magazine for the higher education IT community (freely accessible online), is titled “Focusing on the Common Good for Higher Education” and addresses these and other issues. It is a good read. Let me touch on some of the issues raised. Clifford Lynch (“The Institutional Challenges of Cyberinfrastructure and E-Research”) remarks on how the advent of computing resources has fundamentally changed scholarly practice, from engineering to the humanities. The latter were the latecomers but have often created the more ingenious and transformative applications. Beyond hardware-oriented solutions, more and more effort has gone into “software-driven technologies such as high-performance data management, data analysis, mining and visualization, collaboration tools and environments, and large-scale simulation and modeling systems. Content, in the form of reusable and often very large datasets and databases—numeric, textual, visual—is an integral part of advanced information technology also.”

Development of the academic cyberinfrastructure

The cyberinfrastructure necessary for modern scientific research was at first built out by national institutions, e.g., the U.S. National Science Foundation. The prohibitive cost and scarcity of expertise made this approach the natural choice. In a second stage, individual research units within institutions of higher learning began to deploy specifically tailored IT solutions for projects usually funded, to a large extent, by national funding organizations. As the need for collaboration between institutions has grown along with the pace of communications—as in the larger society, I’d say—so has the need for interoperability and some degree of openness. In some fields, a professional organization took it upon itself to establish repositories and the like to facilitate the exchange of ideas almost in real time, rather than via the old-style journals with their built-in time lag. In others, individual institutions stepped up to the plate.

The fact remains that it is becoming increasingly clear that campus-level infrastructures need to be built that can be used by all scholars, including those who cannot obtain funding as easily and often don’t require specialized solutions anyway. A well-designed, easy-to-use, institution-level cyberinfrastructure is becoming a must. Again, though, care needs to be taken to ensure easy connection with other institutions’ IT infrastructure. This all needs to be thought through in consultation: different institutions, funding organizations and countries of jurisdiction have different rules on how to deal with privacy issues regarding research data gathered from people, and so on. It will also fall mainly to the IT services of institutions of higher learning to be responsible for reliable, secure storage with redundancy, over the longer term. How long should one hold on to the ever-growing mountain of research data?
The same data will also have to be online to the extent that scholars can access it even when away from their campus offices: so-called “cloud” computing. Virtual projects with collaborators spread across many institutions need their data to reside in this “cloud.” Many implementation challenges remain to be worked out.


In “Supporting the ‘Scholarship’ in E-Scholarship,” Christine L. Borgman advocates “e-scholarship,” i.e., “new forms of scholarship that are more information-intensive, data-intensive, distributed, collaborative, and multidisciplinary.” She states: “[a]lthough the data deluge presents the most immediate challenge for information technology strategy, academic planning, and research infrastructure, it is also the area of e-scholarship most subject to hype. Wired recently pronounced that science no longer needs theory, models, metadata, ontologies, or ‘the scientific method’: mining the data deluge replaces all of them.”

I would call this the Google approach to research: if only one can find the perfect algorithm, all problems can be solved given enough data. This is of course naïve. Facts and observations do not exist in a vacuum; just as in particle physics, the act of observation changes the facts. For example, when archaeologists excavate, they destroy the context. The data remain but cannot be replicated later. This is why the reasoning behind research strategies, and the circumstances under which data are gathered, are so important. In the social sciences too, field or study data gathered from human subjects are unique and cannot be collected again exactly.

Borgman concludes: “E-scholarship, as a form of scholarship enabled by cyberinfrastructure, should be viewed as evolution more than revolution. The pace of that evolution varies widely within and between disciplines, campuses, and countries. Distributed and multidisciplinary collaborations are both facilitated and complicated by cyberinfrastructure. Similarly, the changing forms of information and the spreading data deluge offer not only a wealth of new research opportunities but also a daunting array of new challenges. Colleges and universities can minimize the challenges and maximize the opportunities by implementing campus cyberinfrastructure strategies that focus less on the technology per se and more on advances in scholarship and learning—that is, strategies supporting the ‘scholarship’ in e-scholarship.”

It goes without saying that the cyberinfrastructure challenges the academic world has experienced, and the applications and solutions it has found, are instructive for any organization that manages and spreads knowledge. There are lessons to be learnt, mistakes to be avoided.

Note: Cross-posted at

Wow! Here’s an interesting signal coming from the incoming Obama Administration. “” is now carrying a Creative Commons copyright license. According to the copyright policy on, they are using the Creative Commons attribution license. That’s the most open license Creative Commons offers, and it is a great signal that at least some officials in the new Administration “get” the value of greater openness and freedom to use and reuse information. I hope this is a sign of greater sanity on issues such as defense of the public domain, transparency in government, the importance of fair use, and the need for greater openness in publicly financed scholarly and research communications.

Hat tip to my friend and colleague Jason Schultz over at the Samuelson Technology Law Clinic.

Update: The blog has a post describing the reasoning behind adopting the Creative Commons license. Interesting that they are highlighting a comment made by Lessig. It does indicate that the Obama team has some familiarity with the “Access to Knowledge” movement, its thinkers, and goals. A good sign!