Blog

Querying the Collection of the British Museum for Propositional Objects

By nelso663 | February 18, 2017

As I mentioned last month, one of the ideas of the semantic web is to render data from specialized, disparate sources comparable, and this is achieved by developing specifications like CIDOC-CRM. One implementation of CIDOC-CRM is the Erlangen CRM, which heritage institutions like the British Museum use to organize their collections. It is written in the Web Ontology Language (OWL) and can be browsed in an ontology explorer like Protégé or by just reading the XML.

The CIDOC-CRM includes a class called Conceptual Object. Conceptual Object is a subclass of Man-Made Thing and a superclass of both Propositional Object and Symbolic Object. I’m particularly interested in exploring the Propositional Object class, which includes

“immaterial items, including but not limited to stories, plots, procedural prescriptions, algorithms, laws of physics or images that are, or represent in some sense, sets of propositions about real or imaginary things and that are documented as single units or serve as topic of discourse” (CIDOC-CRM, n.d.).

According to the documentation, a set of exemplary instances of this class are the common plot points of Kurosawa’s The Seven Samurai and Sturges’ The Magnificent Seven. A query to the British Museum’s SPARQL endpoint to retrieve the collection’s Propositional Objects might read as follows:

# declare a prefix
# this allows us to refer to objects in the schema directly rather than by their full URI
# e.g., in the query below, crm:E89_Propositional_Object rather than the full URI http://erlangen-crm.org/current/E89_Propositional_Object
PREFIX crm: <http://erlangen-crm.org/current/>

# specify:
# a) the variable that the server should return (?instance)
# b) that the server should return unique instances only (with the DISTINCT modifier)
SELECT DISTINCT ?instance
# specify the pattern for the server to try to match
WHERE { 
 ?instance a crm:E89_Propositional_Object 
}
# state how the response should be ordered…
ORDER BY ?instance
# and the quantity of instances to limit the response to
LIMIT 100

Applying this query to the British Museum’s SPARQL endpoint returns 100 instances of Propositional Object, including Afghan Studies, Annual Reports, and Annual Review of the Royal Inscriptions of Mesopotamia Project.
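For anyone curious how such a query is actually submitted, here is a minimal sketch in Python using only the standard library. SPARQL endpoints conventionally accept an HTTP GET request with the query in a `query` parameter; note that the endpoint URL below is a placeholder assumption, not necessarily the British Museum’s actual address.

```python
import urllib.parse
import urllib.request

# Placeholder endpoint URL -- substitute the real address from the
# British Museum's SPARQL endpoint documentation.
ENDPOINT = "https://collection.britishmuseum.org/sparql"

QUERY = """PREFIX crm: <http://erlangen-crm.org/current/>
SELECT DISTINCT ?instance
WHERE { ?instance a crm:E89_Propositional_Object }
ORDER BY ?instance
LIMIT 100"""

def build_request(endpoint, query):
    """Build a GET request asking for SPARQL results as JSON."""
    params = urllib.parse.urlencode({"query": query})
    return urllib.request.Request(
        endpoint + "?" + params,
        headers={"Accept": "application/sparql-results+json"},
    )

req = build_request(ENDPOINT, QUERY)
# urllib.request.urlopen(req) would then return the JSON result set.
```

Many endpoints also accept POST and other result formats (XML, CSV); the `Accept` header is how the client states a preference.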

Find the British Museum’s SPARQL endpoint and some helpful examples here.

Timeglider JS: moving right along

By Jessica Yann | February 17, 2017

Construction of my timeline project is moving right along. I have entered almost all of the basic events and have formatted the website into roughly what it will finally look like. It is really coming together! I am using Timeglider JS as the framework for the timeline portion of my project and coding the rest of the pages with HTML/CSS. So far, it has been pretty easy to manipulate the basic components of Timeglider to enter my own data points and redo the icons (I’m pretty proud of my legend). It has definitely been a learning process, but I think it will do what I want. Assuming I keep all my commas where they are supposed to be.

While the content is not yet as complete as it will be by the end of the project, I welcome feedback (just understand that nothing is yet in its final version!). You can view my timeline here. Perhaps more importantly, I need a catchy title! Timeline of Michigan Archaeology is just too long. What do you think, internet? Take a peek through the site, then give me your feedback. If I choose your title, I’ll give you an acknowledgement on my page! 🙂


Building the Project Narrative

By mahnkes1 | February 9, 2017

As my project starts to take a more intelligible form, I’d like to share a few of the new features on the beginning pages. Initially, my plan was to focus on three waves of Filipinx immigrants and where they settled in Michigan. Each wave would have its own page, showcasing movement and settlement. However, that meant an extensive pursuit of data, and given time constraints and my current skill level, I resolved to focus only on post-1965 groups, since they seemed potentially informative for contemporary concerns of displacement and urban planning.

I’ve settled on two current Filipinx and Asian American spatially representative sites, and have started wrapping up analysis on the impact of one of them, but something about the accumulating narrative of the site still fell short for me. The pictures and stories of the first groups of APA immigrants kept coming back, providing a fuller arc in the discussion of what it means to be a citizen, and I realized this would be an important underlying consideration as users explore the later pages about continuous efforts to carve out space for cultures.

As a result, I created a beginning page with maps highlighting some of the first Filipinx immigrants’ residences in Ann Arbor and Detroit. By some stroke of luck, I managed to create a toggling button for seeing map layers of these residences by decade. Users would ideally be able to click on specific decades, gradually populating the map with the general areas of initial settlement. Markers are also written with popups that reveal the name, year registered in the Bureau of Insular Affairs, address, major/job, and school. If my luck persists, I hope to also overlay the maps with circled areas that represent urban development affecting residential areas. Populating these maps will take some time and the data will be nowhere near exhaustive, but it will provide an interesting portrait of the general areas of immigrant settlement.

Capturing Campus Cuisine: User Interaction

By Autumn Beyer | February 9, 2017

Following up on my previous blog about choosing an MSU theme for the Capturing Campus Cuisine webpage, this post will focus on user interaction and experience. While the major sections of the project’s webpage had been decided previously, I was still not completely sure how I wanted users to move through and interact with the site. After discussion with my partner on this project, Susan Kooiman, and the director of the Campus Archaeology Program, we decided to organize the section headers to move from the themes of food practices, to the research methods we used to learn about those practices, to the complete meal reconstruction conclusions, followed by the interactive atlas and additional resources.

Slowly Building My Website

By nesbit17 | February 3, 2017

I’ve done away with Bootstrap and am giving it a go with HTML and CSS.  Everything is coming along… slowly but surely.  I wish I could globally change my sub-pages, but am not savvy enough to know how. Lots of copy/paste going on.  Still pondering a name for the website.  It’ll likely come to me in a dream. Hopefully.
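For readers in the same copy/paste boat: one low-tech way to change sub-pages globally, without a framework, is a tiny build script that pastes shared markup into every page wherever a placeholder comment appears. This is only a sketch; the file names and the `<!-- NAV -->` placeholder are hypothetical, not anything from the site above.

```python
import pathlib

PLACEHOLDER = "<!-- NAV -->"

def inject(shared_html, page_text):
    """Replace the placeholder comment with the shared markup."""
    return page_text.replace(PLACEHOLDER, shared_html)

def update_site(root="."):
    """Paste nav.html into every other .html file under root."""
    nav = pathlib.Path(root, "nav.html").read_text()
    for page in pathlib.Path(root).glob("*.html"):
        if page.name == "nav.html":
            continue
        page.write_text(inject(nav, page.read_text()))
```

Edit `nav.html` once, re-run the script, and every sub-page picks up the change.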

Responsive Rhetoric

By pebbles1 | February 3, 2017

This week has been a hard one and the year has had a rocky start for me: I have been sick, and I am concerned about recent news that overlaps with my community and research. President Trump is making way for Keystone XL and the Dakota Access pipeline (DAPL). Opposition to these two pipelines is the basis of the Native movements around Idle No More and Standing Rock (Mni Wiconi), which pushed for the stoppage of both projects. This is particularly discouraging, as this action would threaten tribal sovereignty and break treaty law.

The result of the decision to allow the pipelines to move forward has yet to be seen. However, I am interested to see that protests occurred immediately: since the announcement two days ago, there have been protests in New York, in Washington D.C. yesterday, and in Minnesota today, with hundreds of protesters at each event, despite it falling in the middle of the work week and on extremely short notice. This means that support is still strong and there is a clear alliance among the over 150 Indigenous Nations who support this movement, as well as the millions of Americans who stand united with us.

The chairman of the Standing Rock tribe, David Archambault II, responded to President Trump’s permission for the Army Corps of Engineers to bypass the environmental analysis by writing:

Your Memorandum of January 24th instructs the Secretary of the Army to direct the Assistant Secretary for Civil Works and the US Army Corps of Engineers to review and expedite “requests for approvals to construct and operate the DAPL,” including easements. It also directs them to consider rescinding or modifying the Memo of December 4th, which calls for an Environmental Impact Statement and consideration of a reroute. There is more, but perhaps most astonishingly it calls for consideration of withdrawal of the Notice of Intent to prepare an EIS.

President Trump, the EIS is already underway. The comment period does not close until February 20th and the Department of the Army has already received tens of thousands of comments. This change in course is arbitrary and without justification; the law requires that changes in agency positions be backed by new circumstances or new evidence, not simply by the President’s whim. It makes it even more difficult when one considers the close personal ties you and your associates have had with Energy Transfer Partners and Sunoco.

Your memorandum issues these directives with the condition that these actions are carried out “to the extent permitted by law.” I would like to point out that the law now requires an Environmental Impact Statement. The USACE now lacks statutory authority to issue the easement because it has committed to the EIS process. Federal law, including the requirement of reasonable agency decision making, prevents that.

He continues to hold to Tribal and legal sovereignty with the following comments:

The problem with the Dakota Access pipeline is not that it involves development, but rather that it was deliberately and precariously placed without proper consultation with tribal governments. This memo takes further action to disregard tribal interests and the impacts of yesterday’s memorandums are not limited to the Standing Rock Sioux Tribe. This disregard for tribal diplomatic relations and the potential for national repercussions is utterly alarming.

This gives encouragement to the millions of people who are members of Tribal Nations and those who stand united with them. This unity is the strength of the movement, the nations, and the communities; may these voices continue to speak out and exercise their sovereignty and independence while encouraging considerate and thoughtful civility on the part of the U.S. Government and the Tribal Nations.

However, our survival is our resistance; our survivance is our voice, our sovereignty. Life is a basic part of nature, and nature is the most basic of laws. When life is threatened by endangering life-giving, life-maintaining water, we resist to survive. We resist for our children and the next seven generations. Our survivance is ongoing and we will not stay silent.

Making maps talk…

By swayampr | February 2, 2017

When I look at a map, I want to know how it relates to the reality of the terrain. One of the things I learned during my Master’s in Urban Design was to use AutoCAD. I enjoyed being able to create detailed figure-grounds, especially tracing over archival maps. The challenge, however, was: how would I ensure that they were projected properly? It was all very well to have a really (what I thought, at least) pretty map in 2D, but it was a whole exercise to actually project it the right way. I unsuccessfully tried to use Rhino and other tools to create maps that were projected right. It was only last semester that I found out that one could geo-rectify maps super easily (there is a list of tutorials you can use at the end of this blog post)! There is a range of software and websites that help with georeferencing.

The question, I guess, is: why is geo-referencing important for my project? What will it add?

The simple answer is that geo-referencing a 2D map spatializes it in a far more real way than looking at it and comparing it with a globe or 3D map. Especially when it comes to a historical map, georeferencing lets the viewer get a better sense of what used to be and compare it to how things have changed.

Georeferencing, simply put, works like this: the user identifies anchor/control points on both the 2D map and the properly projected map (oftentimes archival maps have contour data and/or labels that can be helpful in figuring out these points), the user then marks those on both maps, and voilà! The software/website stretches the 2D map to match the projected map. Depending on the accuracy of both maps, the accuracy of the corrections and distortions will vary.
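The simplest version of that “stretching” is an affine transformation, which three control points pin down exactly. Here is a minimal Python sketch of that idea; real GIS tools use many more points with least-squares fitting (and often higher-order transformations), and the control points below are hypothetical.

```python
def det3(a):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
          - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
          + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))

def solve3(m, v):
    """Solve the 3x3 linear system m x = v by Cramer's rule."""
    d = det3(m)
    out = []
    for col in range(3):
        mc = [row[:] for row in m]
        for r in range(3):
            mc[r][col] = v[r]
        out.append(det3(mc) / d)
    return out

def fit_affine(pixels, coords):
    """Fit X = a*x + b*y + c and Y = d*x + e*y + f from three control points."""
    m = [[x, y, 1.0] for x, y in pixels]
    row_x = solve3(m, [cx for cx, cy in coords])
    row_y = solve3(m, [cy for cx, cy in coords])
    return row_x, row_y

def apply_affine(fit, point):
    """Map a pixel position on the scan to geographic coordinates."""
    (a, b, c), (d, e, f) = fit
    x, y = point
    return (a * x + b * y + c, d * x + e * y + f)

# Hypothetical control points: pixel positions on a scanned map matched
# to geographic coordinates (longitude, latitude).
pixels = [(100, 100), (900, 120), (120, 850)]
coords = [(-84.07, 36.19), (-84.01, 36.19), (-84.07, 36.13)]
fit = fit_affine(pixels, coords)
```

With the fit in hand, every other pixel on the scan can be mapped the same way, which is exactly the “tack pin” stretching described above.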

Georeferencing a historical map requires a knowledge of both the geography and the history of the place you are studying to ensure accuracy. The built and natural landscapes change over time, and it is important to confirm that the location of your control points — whether they be houses, intersections, or even towns — have remained constant. Entering control points in a GIS is easy, but behind the scenes, georeferencing uses complex transformation and compression processes. These are used to correct the distortions and inaccuracies found in many historical maps and stretch the maps so that they fit geographic coordinates.[1]

In a sense this treats the control points as tack pins that pin the historical map to a three-dimensional surface. For a project such as mine, a georeferenced map makes it easier to see the ways in which the planners of Norris planned the town. It makes relationships with the nearby dam and urban areas clearer. And it also gives the user the ability to look at what has changed and/or the difference between planning and implementation. For a user, a well-done georeferenced map also makes the experience a lot more interactive and meaningful.

Learning georeferencing:

I must admit that at first I didn’t think I would be able to do it. So I did a test run with a low-resolution map of Norris, and it worked really well! Heartened by that, I am currently finishing up the high-res map georeferencing. My next hurdle is putting it onto the website (I am still figuring that out!). I shall share the link as soon as it’s done.

Some of the links I found particularly useful and easy are listed below:

http://programminghistorian.org/lessons/georeferencing-qgis

http://www.kristenmapes.com/georectifiedmap1/

http://history2016.doingdh.org/map-warper-tutorial/

Happy georeferencing!

[1] Jim Clifford et al,  Georeferencing in QGIS 2.0 (2013). Accessed December 25, 2016. http://programminghistorian.org/lessons/georeferencing-qgis.

How to Build the Directory of Oneota Scholars

By Nikki Silva | January 31, 2017

In the past few weeks I have struggled to decide how I will build my database into my GitHub Pages site without learning how to code SQL (Structured Query Language), which would be difficult given the amount of time I have to complete this project. I was struggling to use Airtable as a front end, and I think it will be easier to just create a template in the HTML for entries and fill in the information this way, while still using Airtable to house the information. I will pull from this file (like an online Excel spreadsheet) to populate the directory. I will have the letters of the alphabet listed at the top of my Directory page, anchored in the HTML to each section of scholars listed alphabetically by last name.
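The template approach can be sketched roughly like this: fetch records from Airtable’s REST API and fill an HTML snippet per scholar. The base ID, table name, and field names below are placeholders, not the project’s real ones, and this is only one possible shape for the idea.

```python
import json
import urllib.request

ENTRY_TEMPLATE = """<div class="scholar" id="{anchor}">
  <h3>{last}, {first}</h3>
  <p>{institution}</p>
</div>"""

def render_entry(fields):
    """Fill the HTML template from one record's fields."""
    return ENTRY_TEMPLATE.format(
        anchor=fields["Last Name"].lower(),
        last=fields["Last Name"],
        first=fields["First Name"],
        institution=fields.get("Institution", ""),
    )

def fetch_records(base_id, table, api_key):
    """Fetch one page of records from the Airtable REST API."""
    url = f"https://api.airtable.com/v0/{base_id}/{table}"
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_key}"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["records"]
```

A small script could then sort the records by last name, render each one, and write the concatenated entries into the Directory page between the alphabetical anchors.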


Wading Through Skeletal Aging Literature and Raphael.js

By Jack Biggs | January 30, 2017

Apologies for my tardiness in posting!  It’s been an incredibly hectic semester so far and time keeps on slipping away!

Since my project is focused on subadult skeletal age estimation, I’ve really started going through the literature and publications on the subject. On one hand, this is a great way for me to conduct in-depth literature reviews for the methods section of my comprehensive exams and prepare for my dissertation, since I’m focusing on the growth and development of ancient Maya subadults and how social, environmental, and biological stressors affect those processes. On the other hand, I’ve been running into a few academically-related brick walls. One common thread I’ve found throughout most of the literature is that there really is no single agreed-upon method for most of the transitional age-related changes for any single bone or bone element. Academia, especially the Victorian and early 20th-century academia in which many of these studies originally took place, is full of researchers pushing their own opinions or just blasting other scholars’ methods. Additionally, the racist roots of physical anthropology treated non-white populations as a curiosity while only conducting comprehensive and in-depth studies of white European or American populations. This is an unfortunate trend that extended embarrassingly far into the 20th century, and it was not until after WWII that things began to change.

However, the vast majority of skeletal studies, as a result of large institutional collections, are still comprised of mostly white individuals which limits the degree of applicability for those studying cultures in non-white areas of the world where different cultures and environments greatly dictate development of the human skeleton.  As a result, the majority of the methods employed in this interactive website will be pulled from studies comprised of mostly white individuals, as those have most often been heavily researched.  For me, this is an unwanted convenience as it does not actually represent the full breadth of human variation that we see across populations and cultures across the globe.  (However, population-specific studies have become very popular and standards for specific regions and populations have increased, but not enough to the point yet to where I could effectively implement them into my website.)

An additional unwanted event occurred in which I accidentally discovered that another researcher is also using the title ‘OSSA’ for his osteological software that statistically estimates ancestry. It has not yet been published and is still in its beta test phase, which is why nothing originally came up in my search for anyone else using that acronym. Although I think I may technically finish my website before him and could thus use the name, this researcher is on my dissertation committee and I felt it wisest not to make him mad! So for right now, the working title of the project is ‘Fontanelle’, in reference to the spaces of incomplete closure on an infant’s cranium.

One last aspect of the project I’m working through is the interactive graphic on the landing page. Ideally it will be a juvenile skeleton in which, when the mouse hovers over a specific bone or element, such as the skull, that bone will change color, and clicking on it will take you to a new page specifically focusing on aging methods for the bones of the cranium and mandible. I have been trying to accomplish this with Raphael.js, which allows you to draw vector graphics on webpages. Since I have had a slower-than-expected start to the semester, I’m a little behind in my execution, and it is still weighing me down. I know that once I am able to map out a single bone, the rest of the skeleton should be relatively pain-free (although this might take a while with all the bones in the body!).

Learning About Web 3.0

By nelso663 | January 29, 2017

My project involves working with some of the technologies of the semantic web. The main idea of the semantic web (or web 3.0, and in Berners-Lee’s language the “read+write+execute” phase that will supersede the “read-only” phase of web 1.0 and the “read+write” phase of web 2.0) is for web services to reason automatically about resources. Robust descriptions enable the linking of heterogeneous resources. Semantic web services commonly use the Resource Description Framework (RDF) to represent entities in terms of subjects, objects, and predicates.
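The subject-predicate-object shape of RDF can be illustrated in a few lines of plain Python, with hypothetical prefixed names standing in for full URIs (real systems use serializations like Turtle or RDF/XML rather than tuples):

```python
# A toy triple store: each statement is a (subject, predicate, object) tuple.
triples = [
    ("bm:object123", "rdf:type", "crm:E89_Propositional_Object"),
    ("bm:object123", "rdfs:label", "Annual Reports"),
]

def objects_of(triples, subject, predicate):
    """Return every object linked to a subject by a given predicate."""
    return [o for s, p, o in triples if s == subject and p == predicate]
```

A SPARQL pattern like `?instance a crm:E89_Propositional_Object` is essentially a query over statements of exactly this shape.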

Take the hyperlink, for instance: where a classic hyperlink connects one document on the web to another, web 3.0 proposes to link data within a document to data within another. This proposition depends on data modeling work: specifying a domain’s things as categories (classification), identifying supersets and subsets of its classes (generalization), and specifying the part-whole relationships in which its classes participate (aggregation).

Where descriptions of resources are robust, we are able to build services to reason about them intelligently and automatically. Europeana, the European Union’s platform for heritage content, provides good documentation and maintains a SPARQL endpoint, which I’ve found helpful for learning about these technologies. See the video I’ve posted above for more information.